
Data quality problems are expensive and time-consuming


At some point in today's dynamic business environment, companies will need to modernize, Dylan Jones writes in a recent post for the Data Roundtable Blog. For many firms, that means investing in big data solutions that help generate actionable insight. But when they blow the dust off their existing databases to make room for shiny new analytics strategies, they may find something disturbing: while no one was paying attention, their data quality deteriorated.

This may seem like an easy fix, since big data adopters already plan to replace their old systems with new tools. However, poor data quality must be addressed before new platforms can be used, and this is often where the true costs of negligent governance are recognized, Jones writes. Cleansing efforts carry a tangible price tag, but the losses in time and productivity are often greater.

To avoid problems as big as the data they plan to use, firms should get their quality under control sooner rather than later. This often begins with a single customer view, Henrik Liliendahl Sorensen recently wrote in an article for Liliendahl on Data Quality: establish one authoritative record per customer so databases capture accurate information at the outset, then use that standard to search for errors and scrub additional entries.
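To make the idea concrete, here is a minimal sketch of how a single customer view can act as that standard. It assumes hypothetical customer records with "name" and "email" fields and uses the email address as an illustrative match key; neither the fields nor the matching rule come from the articles cited above. Incoming entries are normalized, compared against the golden record store, and flagged for scrubbing when they conflict instead of being silently inserted.

```python
# Sketch only: a toy single-customer-view store. Field names ("name",
# "email") and the email-based match key are illustrative assumptions.

def normalize(record):
    """Canonicalize the fields used for matching and comparison."""
    return {
        "name": " ".join(record["name"].lower().split()),
        "email": record["email"].strip().lower(),
    }

golden = {}  # match key -> the single, authoritative customer record

def upsert(record):
    """Insert a new customer, or flag a conflicting duplicate for review."""
    clean = normalize(record)
    key = clean["email"]
    if key in golden:
        # Same customer seen before: surface any field that disagrees
        # with the golden record so it can be scrubbed, not duplicated.
        diffs = {f: (golden[key][f], clean[f])
                 for f in ("name",)
                 if golden[key][f] != clean[f]}
        return ("needs_scrub", diffs) if diffs else ("duplicate", None)
    golden[key] = clean
    return ("created", None)

print(upsert({"name": "Ada  Lovelace", "email": "ada@example.com"}))
# ('created', None)
print(upsert({"name": "Ada Lovelace ", "email": "ADA@example.com "}))
# ('duplicate', None)
print(upsert({"name": "A. Lovelace", "email": "ada@example.com"}))
# ('needs_scrub', {'name': ('ada lovelace', 'a. lovelace')})
```

Even a toy version like this shows the payoff: once one clean record exists per customer, every later entry can be validated against it at write time, which is far cheaper than scrubbing an entire database after the fact.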