9 Tips to Improve Data Quality


What are your company’s most important assets? Real estate/locations, equipment, inventory, employees and intellectual property are undoubtedly at the top of your list. What often gets overlooked, however, is one of the most valuable corporate assets – data.

Data is the backbone of operations, driving critical financial and nonfinancial decisions at every turn. With so much at stake, having accurate, complete and consistent data is paramount. The consequences of bad data can be steep, ranging from improper payments and business process delays to poor decisions or even fines for regulatory infractions.

Good data is the first step on the road to success. Here are nine tips to improve the quality of your data:

1. Know your data – Do you know what data you’re collecting, why you’re collecting it and where it comes from? Collecting data you don’t need can be expensive, adding processing and storage costs – and it can make it that much harder to accurately track what you do need. Be as forward-looking as possible, and focus on the data necessary to accomplish your goals. And make sure that every component is coming from a trusted and knowledgeable source.

2. Validate data automatically – Validate data as it is entered by automatically flagging missing, inconsistent or unexpected information. It is also important to ensure that data fields, calculations and formulas are tested for accuracy and consistency across all points of entry.
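As a rough illustration of entry-time validation, the sketch below flags missing, unexpected and out-of-range values in a single record. The field names, status codes and rules are invented for the example, not drawn from any particular system:

```python
# Minimal entry-time validation sketch; all field names and
# allowed codes below are hypothetical examples.
REQUIRED_FIELDS = {"claim_number", "loss_date", "amount"}
VALID_STATUS_CODES = {"OPEN", "CLOSED", "REOPENED"}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems found in a single record."""
    problems = []
    # Flag missing or empty required fields.
    for field in sorted(REQUIRED_FIELDS):
        if not record.get(field):
            problems.append(f"missing required field: {field}")
    # Flag codes outside the allowed set.
    status = record.get("status")
    if status is not None and status not in VALID_STATUS_CODES:
        problems.append(f"unexpected status code: {status}")
    # Flag values that fail a simple range check.
    amount = record.get("amount")
    if amount is not None and amount < 0:
        problems.append(f"negative amount: {amount}")
    return problems

record = {"claim_number": "C-1001", "loss_date": "2021-03-04",
          "amount": -250.0, "status": "PENDING"}
issues = validate_record(record)
```

In practice these checks would run at every point of entry (forms, imports, APIs) so that bad values are rejected or flagged before they land in the database.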

3. Correct data issues at the source – The quality of your data is only as good as what’s feeding into your system. If you discover problems with incoming data (incorrect financials, errant codes, missing data elements, etc.), it’s important to go all the way back to the original source to make corrections. The easiest way to do this is to establish a specific contact at each vendor or other data provider who can resolve issues as they occur. You also might consider scheduling regular meetings to keep things running smoothly.

4. Be especially watchful when bringing in new data sources – Onboarding data from new sources requires an even higher level of vigilance to protect against improper data conversion or data loss. For instance, if claim numbers change from system to system, map the original value to new fields in the current system to preserve the history. Converting financials to a new system is especially tricky. Make sure you have a firm grasp of how your current financials are tracked and carefully translate that logic to the new system. And ensure proper location structures and code translations are in place.
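The claim-number mapping described above can be sketched as follows. The record layout and the new numbering scheme are illustrative assumptions; the point is simply that the original identifier is carried into a dedicated field so history stays traceable:

```python
# Hypothetical conversion sketch: preserve the legacy claim number
# in its own field when records move to a new system.
def convert_claim(old_record: dict) -> dict:
    return {
        # New system assigns its own number (format invented here).
        "claim_number": f"NEW-{old_record['claim_number']}",
        # Original value is mapped to a dedicated field, not discarded.
        "legacy_claim_number": old_record["claim_number"],
        "amount": old_record["amount"],
    }

old = {"claim_number": "A-123", "amount": 500.0}
new = convert_claim(old)
```

Keeping the legacy value in its own field means historical reports and audit trails can still be joined back to the source system after the cutover.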

5. Say no to free-form text wherever possible – Limit free-form text to names, addresses, short descriptions and notes – and use codes/lookups everywhere else. Even the most conscientious person is bound to make occasional mistakes when entering free-form text data. Using coded fields improves accuracy, facilitates reporting and ensures consistency across the board.
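One common way to apply this tip is a lookup table that accepts only predefined codes. The cause-of-loss codes here are invented for the example; any real system would maintain its own code set:

```python
# Illustrative coded-field lookup: values must come from this table
# rather than free-form text. Codes are hypothetical.
CAUSE_OF_LOSS = {
    "FI": "Fire",
    "WD": "Water damage",
    "TH": "Theft",
}

def describe_cause(code: str) -> str:
    """Translate a cause-of-loss code; reject anything not in the table."""
    try:
        return CAUSE_OF_LOSS[code]
    except KeyError:
        raise ValueError(f"unknown cause-of-loss code: {code}")
```

Because every record carries one of a fixed set of codes, reports can group and compare records reliably, with no risk of "Fire", "fire" and "FIRE DAMAGE" being counted as three different causes.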

6. Consolidate systems – One of the biggest mistakes companies make is to use multiple systems to track data. Maintaining separate systems is not only expensive, but it causes problems with workflow, data consistency and reporting. Consolidating data into a single system gives you much more control over the integrity of data coming in and going out.

7. Have a single administrator for your system – Don’t let just anyone make changes to your database setup. The responsibility for adding or altering fields, codes and locations needs to be limited to a single admin user (or user group). Any potential changes to your database should be reviewed and approved through a clearly defined change-control process.

8. Run reports frequently – Reports are the last checkpoint to verify data integrity. Regularly run a core set of baseline reports to review your data, identify anything unexpected – and, of course, go all the way back to the source to make any needed corrections.
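A baseline report of the kind described above might look like the toy sketch below: summarize record counts, code distributions and totals, and surface gaps worth investigating. Field names and the sample data are illustrative assumptions:

```python
# Toy baseline report: summarize records and surface anything
# unexpected (missing amounts, code distribution, totals).
from collections import Counter

def baseline_report(records: list[dict]) -> dict:
    status_counts = Counter(r.get("status", "UNKNOWN") for r in records)
    total_amount = sum(
        r["amount"] for r in records if r.get("amount") is not None
    )
    missing_amounts = [r for r in records if r.get("amount") is None]
    return {
        "record_count": len(records),
        "status_counts": dict(status_counts),
        "total_amount": total_amount,
        "missing_amount_count": len(missing_amounts),
    }

records = [
    {"status": "OPEN", "amount": 100.0},
    {"status": "OPEN", "amount": None},   # gap the report should surface
    {"status": "CLOSED", "amount": 50.0},
]
report = baseline_report(records)
```

Running the same core report on a regular schedule makes drift easy to spot: if the missing-amount count or a status distribution suddenly shifts, that is the cue to trace the problem back to its source.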

9. Seize the day! – Data issues get more difficult to correct over time. Even relatively minor problems can quickly snowball out of control if not promptly addressed. Take action now! You’ll be glad you did.

Jay Walkington, Marsh ClearSight

Jay Walkington is Senior Manager, Professional Services – Data. Jay and his team develop collaborative data solutions that help risk managers achieve a clear line of sight to their risk. He drives best practices and standards for the overall Data Operations group, consults with clients on system design, develops custom data solutions and supports ongoing client needs. The Data Operations team is responsible for both new projects and ongoing data services for all Marsh ClearSight clients.


Marsh ClearSight
Marsh ClearSight helps clients understand and manage their risk so they can reduce their total cost of risk, improve their organization’s operational performance, and protect their reputation.

A business unit of Marsh LLC, Marsh ClearSight is the technology industry leader, providing support to more than 750 customers in 25 countries with a trusted data store of over 60 million claims amassed through decades of operation. With the industry’s single largest risk database, Marsh ClearSight uniquely enables its customers to accurately analyze trends, gain industry insights, optimize decision-making, and reduce costs across the entire risk ecosystem.