Ensuring data quality poses challenges for provider organizations
Ensuring data quality is one of the biggest challenges of data management, and for many organizations it remains a struggle.
This issue looms particularly large in healthcare, where provider organizations typically collect information from a variety of information systems, complicating data aggregation. And the quality of that data can carry life-and-death consequences for the patients being treated.
Large organizations “always face data quality issues, but often do not understand the extent of the challenge until they attempt to compile data,” says Don Loden, managing director of data management and advanced analytics and business intelligence at Protiviti, a global consulting firm.
“This scenario becomes apparent when (organizations) take on analytics projects to combine disparate data from multiple systems,” Loden says. “The reason that this phenomenon exists is that data quality constraints will never be consistent from system to system. That holistic design to support quality data across systems is intrinsic for cross-platform analytics design, but effectively out of scope for a single-source system alone.”
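As a hypothetical illustration of why per-system constraints break down during cross-system analytics, consider two source systems that each store valid data under their own conventions. The record layouts, field formats, and the `normalize` function below are invented for this sketch, not drawn from any particular product:

```python
from datetime import datetime

# Hypothetical records from two source systems; each enforces its own
# (mutually inconsistent) conventions for identifiers, dates, and codes.
ehr_record = {"patient_id": "00042", "dob": "1985-03-07", "sex": "F"}
billing_record = {"patient_id": "42", "dob": "03/07/1985", "sex": "Female"}

def normalize(record):
    """Map a source record onto a shared schema for cross-system analytics."""
    dob = record["dob"]
    # Accept either ISO-style (EHR) or US-style (billing) date formats.
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            dob = datetime.strptime(dob, fmt).date().isoformat()
            break
        except ValueError:
            continue
    return {
        "patient_id": str(int(record["patient_id"])),  # strip zero-padding
        "dob": dob,
        "sex": record["sex"][0].upper(),  # "F" / "Female" -> "F"
    }

# Each record satisfied its own system's rules, yet they only match
# after a holistic normalization step that neither system provides.
assert normalize(ehr_record) == normalize(billing_record)
```

Each record is perfectly valid inside its own system of record; the mismatch only surfaces, as Loden notes, when an analytics project tries to combine them.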
Organizations can use tools reactively, responding to and fixing data quality issues after the data has been collected and created by systems of record, Loden says. “This is done either within the analytics platform or by some master data-driven system,” he says.
Another option is to use proactive tools to correct, cleanse and standardize data at the time of creation.
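The proactive approach can be sketched as validation and standardization at the point of data entry, before a record ever reaches the system of record. The registry structure, field rules, and function names below are illustrative assumptions, not a reference to any vendor's tool:

```python
import re

TEN_DIGITS = re.compile(r"^\d{10}$")

def create_patient(registry, name, phone):
    """Proactively correct, cleanse, and standardize data at creation time,
    rather than cleansing it later in the analytics layer."""
    digits = re.sub(r"\D", "", phone)  # standardize: keep digits only
    if not TEN_DIGITS.fullmatch(digits):
        # Reject bad input before it contaminates the system of record.
        raise ValueError(f"invalid phone number: {phone!r}")
    record = {"name": name.strip().title(), "phone": digits}
    registry.append(record)
    return record

registry = []
create_patient(registry, "  jane doe ", "(555) 867-5309")  # cleansed and stored
try:
    create_patient(registry, "John Roe", "555-1234")  # too few digits: rejected
except ValueError:
    pass
```

Here only clean, standardized records reach the registry, which is what distinguishes this approach from reactive cleanup in the analytics platform.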
“Reactive efforts often lead to proactive efforts, as the latter is more complex and expensive,” Loden says. “Reactive efforts are often a good way to understand the extent of the data quality problem and the relative value that can be produced by solving it.”
In the future, Loden expects to see traditional data quality and data exploration tools supported by techniques such as machine learning, in an effort to increase the reach and productivity of these products. “This will drive the fixing of data, over waiting to find data of suspect quality before fixing it,” he says. “Vendors are starting to shift in this direction, and I would expect this shift to accelerate.”