Do you trust your data? U.S. organizations believe that, on average, 32 percent of their data is inaccurate, according to Experian’s most recent Data Quality Benchmark Report.
Data quality has been talked about for years in business intelligence circles. Mario Barajas, a consultant for the ZAP BI U.S. office, notes it’s a topic that won’t go away “because we’re still seeing the same mistakes from years ago.”
‘Garbage In, Garbage Out’
Experian’s research found that the majority of organizations surveyed have strategies in place to manage data quality. Yet 84 percent still plan to invest in additional data quality technology over the next 12 months.
The old phrase “garbage in, garbage out,” meaning your data quality is only as good as what is entered, still rings true. And it will continue to do so, despite the many strategies in place to manage data.
There Will Be Issues
Barajas urges some perspective. He writes: “The first thing we have to understand and unconditionally accept is that it’s next to impossible to have perfect data quality. The bigger the organization, the more likely there will be issues with its data. . . . From my experience in the BI industry, the bigger something is — or the more moving parts something has — the less likely everything will run smoothly. Data quality is no exception – period.”
He suggests that companies prioritize their data efforts around the areas most crucial to the business and not sweat the less important ones.
Big or small, what’s driving concern over data quality in companies has changed, according to a veteran tech reporter. “While cost savings and fraud have traditionally driven data quality efforts, companies are now more concerned about how data quality affects decision-making and customer satisfaction,” Loraine Lawson of ITBusinessEdge writes. That makes sense, she observes, considering the push to expand BI and analytics and get more value from expensive enterprise applications, such as CRM.
So what should companies do to maximize the quality of their data – and leverage it for business intelligence – in a less-than-perfect data universe, especially given the overwhelming amount of information now available in the age of Big Data?
Structuring Data Governance
Potential best practices vary with the size of the business. In large organizations, chief data officers are increasingly common, especially in government, banking and healthcare. Experian recommends centralizing data management under a director, while Howard Dresner, president and founder of Dresner Advisory Services, recommends establishing Business Intelligence Competency Centers.
Alan Dawson, a former data governance director in Australia who is now with Gartner, takes a slightly different tack. He writes that the key to data quality is not centralization but having a real structure for data governance that works for the company. “I suggest that rather than being a hierarchical or functional issue, it will be the social and cultural characteristics of an organisation that will be the deciding factors in determining which approach to adopt,” he writes.
Top-Down and Bottom-Up
Lawson offers a suggestion: “It might help to remember that in the end, you’re probably going to need both a top-down, centralized approach, as well as a bottom-up approach — basically, you need data quality and data governance to be ‘all around’ and comprehensive. That’s the big Nirvana of data management.”
Barajas writes that data quality is “something that should sit with the business and not with IT or the BI team.”
In a new post this week, David Loshin, president of Knowledge Strategy, outlines a strategy for ensuring data is suitable to use: Make business units responsible for applying data quality policies and use data virtualization tools to help manage the process.
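To make that idea concrete, a business-unit-owned data quality policy can be as simple as a set of named rules applied to each record. The sketch below is illustrative only — the dataset, field names and rules are hypothetical, not Loshin’s actual framework or any particular data virtualization product:

```python
# A minimal sketch of rule-based data quality policies, assuming a
# hypothetical customer dataset. Field names and rules are illustrative.
import re

# Each business unit defines the policies for the data it owns:
# a policy is a (description, predicate) pair applied per record.
CUSTOMER_POLICIES = [
    ("email is well-formed",
     lambda r: re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", r.get("email", "")) is not None),
    ("country code is two letters",
     lambda r: len(r.get("country", "")) == 2 and r["country"].isalpha()),
    ("age is plausible",
     lambda r: isinstance(r.get("age"), int) and 0 < r["age"] < 120),
]

def audit(records, policies):
    """Return a list of (record_index, failed_policy) pairs."""
    failures = []
    for i, record in enumerate(records):
        for description, predicate in policies:
            if not predicate(record):
                failures.append((i, description))
    return failures

records = [
    {"email": "pat@example.com", "country": "US", "age": 34},
    {"email": "not-an-email", "country": "USA", "age": 34},
]
print(audit(records, CUSTOMER_POLICIES))
```

The point of the design is that the rules live with the business unit that understands the data, while the audit machinery stays generic — IT can run it anywhere, but only the data owners decide what “good” means.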
Do you trust your data? And what is your organization doing about it?
Want a BI platform that fits into your application strategy? Learn more about Izenda’s modern ad-hoc reporting and analytics solution.