Massive amounts of data remain unanalyzed

Rachel Wheeler

Organizations, both public and private, are collecting more data on their customers now than ever before. Health firms, insurance companies and government services such as tax collectors are gathering information on numerous consumers, especially through online forms, and they're amassing it in large databases.

This massive growth of data holds tremendous potential across the business world, but today's chief information officers have plenty of questions to answer before they can tackle tomorrow's real-world problems.

Trusting our data
Data quality is one major concern for companies that collect large volumes of information, especially when it's all coming from disparate sources.

According to Science Daily, 90 percent of all the data in the world has been generated in the last two years. As people gain more ways of keeping in touch with public and private organizations, including mobile devices and social networks, it becomes more difficult for information overseers to keep everything straight.

Petter Bae Brandtzæg, an analyst at SINTEF ICT, believes that the rise of numerous channels for data mining - including Facebook, SMS, e-mail, blogs, Twitter and Instagram - is making it more difficult to keep large banks of information uniformly accurate. Information officers are striving for consistency.

"We will look at various sources in relation to each other, and for example find out how trustworthy Twitter messages are," Brandtzæg told Science Daily.

Because there are so many channels out there, information is becoming fragmented, and the challenges for companies are building.

Capitalizing on potential
What's more, companies need to do a better job of using all the information in their databases. Simply collecting and stockpiling data is not enough - without analysis, collection is all for naught.

According to The Guardian, the world is analyzing less than 1 percent of its data. The global data supply reached 2.8 zettabytes - that's 2.8 trillion gigabytes - in 2012, but the Digital Universe Study reported that only about 0.5 percent of it was used for analysis. By 2020, the total volume of data should be around 40 ZB, and there's little indication that the world's analytical capabilities will keep pace.
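A quick back-of-the-envelope check puts those figures in perspective. The sketch below assumes decimal SI prefixes (1 ZB = 10^21 bytes, so 10^12 GB); the specific variable names are illustrative, not from the study itself.

```python
# Rough sanity check of the Digital Universe figures (SI decimal prefixes assumed).
ZB_IN_GB = 10**12  # 1 zettabyte = 10^21 bytes = 10^12 gigabytes

total_2012_zb = 2.8        # global data supply in 2012, per the article
analyzed_fraction = 0.005  # roughly 0.5 percent analyzed

total_2012_gb = total_2012_zb * ZB_IN_GB       # 2.8 trillion GB, as stated
analyzed_zb = total_2012_zb * analyzed_fraction

print(f"2012 supply:    {total_2012_gb:.1e} GB")
print(f"Analyzed share: {analyzed_zb:.3f} ZB")
```

In other words, of the 2.8 ZB generated in 2012, only about 0.014 ZB (14 exabytes) was actually analyzed - a gap that would only widen if analysis lags a supply heading toward 40 ZB.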

"As the volume and complexity of data barraging businesses from all angles increases, IT organizations have a choice," Jeremy Burton, executive vice president of product operations and marketing for EMC, told the news source. "They can either succumb to information-overload paralysis, or they can take steps to harness the tremendous potential teeming within all of those data streams."

That choice will have a profound effect on IT's growth in the years ahead. Data offers countless rewards, but information leaders must be willing to cash them in.