Poor data quality can compromise big data's promise

Rachel Wheeler

Businesses have been abuzz about big data's potential throughout 2012, as the technology became accessible to a wider range of smaller companies. However, that potential could go unrealized if companies don't also invest in the necessary data quality tools.

Poor data quality is the Achilles' heel of supply chain management, Adrian Gonzalez wrote in a recent article for Logistics Viewpoints. Data is often outdated, entered incorrectly or left out altogether, which leads to errors downstream. In fact, data cleansing is often cited as one of the most challenging aspects of implementation, with CIOs reporting that their IT teams spend half of their time scrubbing and fixing data to reduce errors and inaccuracies.
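To give a sense of what that scrubbing work looks like in practice, here is a minimal, illustrative sketch (not drawn from the article) of two common cleansing steps: normalizing field values and removing duplicate customer records. The field names "name", "email" and "phone" are hypothetical examples.

def clean_record(record: dict) -> dict:
    """Trim whitespace, normalize case, and turn empty strings into explicit gaps."""
    cleaned = {}
    for key, value in record.items():
        value = value.strip() if isinstance(value, str) else value
        if key == "email" and value:
            value = value.lower()
        cleaned[key] = value if value else None
    return cleaned

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep only the first occurrence of each (name, email) pair."""
    seen = set()
    unique = []
    for record in records:
        key = (record.get("name"), record.get("email"))
        if key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

raw = [
    {"name": "Jane Doe ", "email": "JANE@EXAMPLE.COM", "phone": ""},
    {"name": "Jane Doe", "email": "jane@example.com", "phone": "555-0100"},
]
print(deduplicate([clean_record(r) for r in raw]))

Real cleansing pipelines go much further, of course, but even this small example shows why the work is time-consuming: every field needs its own rules for what counts as valid, equivalent or missing.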

Data quality issues can also wreak havoc on marketing teams that invest funds in mailing campaigns targeting incorrect or outdated addresses. The same problems affect retail businesses and other corporations that must verify identities in loyalty programs or when processing payments, according to a recent blog post by data architect Henrik Liliendahl Sorensen.

Liliendahl writes that identity verification will become even more important in the future, as companies must distinguish and verify identities in rigorous checks for employment screening and criminal investigations, perform lightweight resolutions such as matching names with addresses, and reconcile digital identities with traditional data.
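As a rough illustration of the "lightweight" end of that spectrum, the sketch below (an assumption for this post, not Liliendahl's method) decides whether two records likely refer to the same person by comparing a normalized name against a normalized address, using simple string similarity. The records, field names and threshold are hypothetical.

from difflib import SequenceMatcher

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so formatting differences don't matter."""
    return " ".join(text.lower().split())

def likely_same_identity(record_a: dict, record_b: dict, threshold: float = 0.85) -> bool:
    """Return True when both the name and the address are similar enough."""
    name_score = SequenceMatcher(None, normalize(record_a["name"]), normalize(record_b["name"])).ratio()
    addr_score = SequenceMatcher(None, normalize(record_a["address"]), normalize(record_b["address"])).ratio()
    return name_score >= threshold and addr_score >= threshold

a = {"name": "Henrik Sorensen", "address": "12 Main Street, Copenhagen"}
b = {"name": "Henrik  Sørensen", "address": "12 main street copenhagen"}
print(likely_same_identity(a, b))

A check like this is cheap enough to run on every loyalty sign-up or mailing-list entry, which is exactly where outdated or mistyped addresses tend to slip in.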