After spending considerable time over the last few weeks designing a compliance process to measure the quality of information, I want to share a few observations. Feedback about how companies are addressing data quality through Six Sigma would be most helpful.

Companies are approaching an inflection point: exploiting information is their next lever to innovate or become more efficient. Chief executive officers believe information technology (IT) can be a competitive differentiator. Cost savings initiatives in the information systems function are winding down because of diminishing returns. The volume of corporate data is on the rise. Opportunities for growth and diversification may arise from how companies exploit their information. Capturing those opportunities depends on understanding the quality of information, starting with the basics of how to measure data for defects, and on setting a foundation to improve its reliability.

This inflection point is not lost on control, risk and compliance executives. Sarbanes-Oxley increased the scope of internal controls testing, and audit fees skyrocketed. While audit costs are coming down, new data-centric threats are emerging: state privacy legislation, consumer protections, international regulations and other corporate mandates are placing new demands on how companies collect, analyze, control and disclose information. Data quality is a challenge on multiple fronts, ranging from how companies accrue liabilities to how they model operational risk. Today, many companies lack a framework to measure data quality through a “mandate” lens and extrapolate a corporate data quality index.

Business process management can help. One form of business process management is a “current state assessment” to ease a department into process improvement with DMAIC or organizational transformation with Design for Six Sigma. It goes something like this: document process flows, inputs and outputs; understand the voice of the customer; propose metrics and collect data; and define opportunities for process improvement. Reaching a state where executives receive useful information about process performance is often a significant accomplishment in itself.

Companies can tailor this business process management framework to understand data quality:

  • Start by understanding your important corporate mandates that depend on data quality. For a credit card company, these mandates might include financial reporting, credit risk and management reporting about trends that would signal predatory or discriminatory practices.
  • For each of these mandates, document their information dependencies (i.e., the critical data elements needed to obtain information to fulfill the mandate). Determine where data are provisioned within the company, and whether they are created by upstream business processes or obtained from customers or third parties.
  • With this knowledge, determine criteria for “good” or “bad” data — criteria for completeness, accuracy, consistency, reasonableness, and other relevant quality dimensions. Embed these criteria in query software to measure data, use your method of choice to calculate data quality, and begin spending time on root cause analysis.
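The last step above, embedding criteria in query software and calculating data quality, can be sketched in a few lines of code. This is a minimal illustration, not a prescribed implementation: the field names, validation rules and reference values are hypothetical examples for a credit card company, and the yield metric (the share of records passing every check) is one of several ways to score quality.

```python
# Hypothetical records for three critical data elements.
records = [
    {"account_id": "A100", "credit_limit": 5000, "state": "NY"},
    {"account_id": "A101", "credit_limit": -200, "state": "NY"},  # fails reasonableness
    {"account_id": "",     "credit_limit": 3000, "state": "CA"},  # fails completeness
    {"account_id": "A103", "credit_limit": 7500, "state": "ZZ"},  # fails accuracy
]

VALID_STATES = {"NY", "CA", "TX"}  # illustrative reference set

# One rule per quality dimension; real checks would be far richer.
checks = {
    "completeness":   lambda r: bool(r["account_id"]),
    "accuracy":       lambda r: r["state"] in VALID_STATES,
    "reasonableness": lambda r: 0 <= r["credit_limit"] <= 1_000_000,
}

def quality_yield(records, checks):
    """Fraction of records that pass every quality check."""
    passed = sum(all(check(r) for check in checks.values()) for r in records)
    return passed / len(records)

print(quality_yield(records, checks))  # 1 of 4 records passes -> 0.25
```

Records that fail can then be grouped by which check they failed, which is a natural starting point for the root cause analysis mentioned above.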

A variety of formats is available to present measurement and analysis. Consider modifying statistical process control charts to track data quality. Create a simple scorecard to summarize trends by showing critical data elements on one axis, “mandates” on another axis, and data quality yield rates by mandate in cells. Feedback on other approaches, methods and presentation tools is encouraged.
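The scorecard described above can be rendered with nothing more than a pivot of yield rates. In this sketch the critical data elements, mandates and yield figures are all illustrative placeholders; in practice the cell values would come from measurement queries like the ones discussed earlier.

```python
# Scorecard layout: critical data elements on one axis, mandates on the
# other, data quality yield rates in the cells. All values are invented.
elements = ["account_id", "credit_limit", "state"]
mandates = ["Financial Reporting", "Credit Risk"]

yields = {  # (element, mandate) -> fraction of records passing checks
    ("account_id", "Financial Reporting"): 0.99,
    ("account_id", "Credit Risk"): 0.97,
    ("credit_limit", "Financial Reporting"): 0.95,
    ("credit_limit", "Credit Risk"): 0.92,
    ("state", "Financial Reporting"): 1.00,
    ("state", "Credit Risk"): 0.88,
}

header = f"{'Element':<14}" + "".join(f"{m:>22}" for m in mandates)
print(header)
for e in elements:
    print(f"{e:<14}" + "".join(f"{yields[(e, m)]:>22.0%}" for m in mandates))
```

A low cell flags a specific element-mandate pair for attention, and tracking the same cells period over period gives the trend view a control chart would formalize.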

Over the coming months, I look forward to expanding dialogue about applying Six Sigma to information management and data quality.

About the Author