Why interoperability will get worse before it gets better

Many competitive factors complicate data exchange initiatives, and that should incentivize hospitals to make better use of the data they already have.


We hear the same thing from the health IT community every year: We’re committed to enabling seamless health data sharing. It’s the industry’s perennial commitment that electronic health records (EHRs) will soon share patient data across different platforms to ensure coordinated, high-quality care. Walls will come down in the name of better patient outcomes.

Unfortunately, none of this is going to happen any time soon. In fact, I predict our industry’s struggle with interoperability will get worse before it gets better. Here are three reasons why:

There are too many competing siloed solutions. There are many EHRs and other data analytics tools on the market, each with countless one-off solutions built around its insular, proprietary system. From supporting revenue cycle management to care coordination and virtual visits, these vendors are all vying to become the superior “Swiss Army knife,” with little thought of integrating with other solution providers. As this trend continues, the health IT landscape is getting more fragmented, not less, despite standardization initiatives.

There is no economic incentive to make systems talk. This is an elephant in the room that few seem willing to acknowledge. While there’s no question that sharing patient data with outside providers is the right thing to do, we have to recognize that it’s an added cost for already cost-sensitive health systems. On top of that, data analytics and EHR companies have an economic incentive to keep customers locked into their proprietary platforms and databases. Until there is a clear ROI for opening up data and investing in integration, any commitment to interoperability will remain a well-meaning slogan.

Technology is evolving faster than our ability to connect new systems. The coming tsunami of smart, connected devices in healthcare will only further complicate the pursuit of interoperability. All of these devices will produce more data to ingest, analyze and interpret, and most will arrive with their own proprietary platforms, making an already dizzying environment even more so.

As a data scientist who advises on healthcare analytics for hospitals, health systems and academic medical centers of all sizes, I don’t mean to disparage the industry’s efforts to establish interoperable systems or the great work that’s been done towards promoting technology standardization. I believe we will get there—someday. New technologies such as blockchain are emerging that may even offer hope of leapfrogging the interoperability gap in which we find ourselves.

Until that day comes, however, hospital leaders need to be pragmatic about leveraging the data they can access to drive quality and cost improvements in the here and now. Yes, that may mean limiting yourself to the data your organization already controls rather than waiting for perfect data or heading down an expensive path to get multiple data streams to talk.

I’m aware this recommendation doesn’t quite gel with the industry’s shift toward population health management, which hinges on understanding the complete patient care journey across many different providers and systems in order to effectively manage cost and risk. That’s not to mention integrating things like behavioral health and social determinants into the care equation.

But until this data is high-quality and easily integrated with the hospital’s own systems, I believe this segment of the care continuum should be a secondary focus. Considering that many post-acute facilities for hospice, long-term and skilled nursing care don’t currently have electronic records, let alone systems that can integrate with hospitals, your organization’s efforts are likely to yield greater returns elsewhere in the near term.

Despite our industry’s interoperability woes, here’s the good news: From my experience working with hospitals, I’ve found that 80 to 90 percent of a patient’s cost across the continuum of care is associated with data that hospitals already have under their complete control. So, how can hospitals make the most of the data they already have access to? Here are three steps that I have seen result in exceptional improvements for hospitals in care quality and costs:

Clearly define service lines, then compare utilization and outcomes against top-performing peers. They may not know it, but most hospitals are already collecting the data needed to evaluate utilization of lab tests, diagnostics, drugs and ancillary services, which I’ve found to represent key savings opportunities. Analyzing the utilization of all services by clinical condition and benchmarking performance against top-performing peers greatly simplifies the work of reducing clinical variation.
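
As a rough sketch of what this benchmarking can look like in practice, the snippet below computes a hospital’s own utilization rates by clinical condition and service, joins them to peer benchmark rates, and flags the largest gaps. It assumes encounter-level data and peer rates are available as simple tables; the file names and columns (condition, service, used, peer_rate) are hypothetical illustrations, not any particular vendor’s schema.

```python
# Minimal benchmarking sketch. All file and column names below are
# hypothetical placeholders for whatever your own data warehouse provides.
import pandas as pd

# One row per encounter: the clinical condition treated and whether a
# given ancillary service (lab, imaging, monitoring, etc.) was used.
encounters = pd.read_csv("encounters.csv")    # columns: condition, service, used (0/1)
peers = pd.read_csv("peer_benchmarks.csv")    # columns: condition, service, peer_rate

# Our utilization rate: share of encounters for each condition/service pair.
ours = (
    encounters.groupby(["condition", "service"])["used"]
    .mean()
    .rename("our_rate")
    .reset_index()
)

# Compare against top-performing peers and flag services used far more often.
report = ours.merge(peers, on=["condition", "service"])
report["excess"] = report["our_rate"] - report["peer_rate"]
flagged = report[report["excess"] > 0.10].sort_values("excess", ascending=False)

# A row like (spinal surgery, intra-operative monitoring) with a large
# positive excess is the kind of finding that surfaces for clinical review.
print(flagged.to_string(index=False))
```

A flagged row isn’t a verdict; it’s a starting point for an informed conversation with clinicians about whether the variation is warranted.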

Using this approach, for example, I was recently able to show a hospital that it was significantly over-utilizing intra-operative monitoring for spinal surgery patients compared with similar organizations. Months earlier, the neurosurgeons had told leadership that they used the service because it was the standard of care. Now, with data in hand, the hospital’s leadership was able to have an informed conversation with the neurosurgeons, armed with examples of prestigious hospitals that were not using intra-operative monitoring for that particular clinical condition. This eventually led to a significant cost reduction without hurting care quality.

Build a culture of acting upon data. It’s a missed opportunity when a hospital collects data widely but has no processes in place to interpret it and make it actionable. In that regard, I believe the most disruptive force in health analytics won’t come from technology. It will come from users getting smarter about using data.

To begin building a culture that values data-driven action, it’s important to engage multiple stakeholders, especially physicians. Administrators and health IT professionals must involve physicians in deciding how data will be used to track performance and measure outcomes. Securing their buy-in and developing specific metrics for various service lines will help everyone align on opportunities for cost reduction and quality improvement.

Embrace transparency, even if it’s scary. Admittedly, another reason I’m skeptical about interoperability is that, often, even departments within the same hospital or health system aren’t communicating. How will we ever get anywhere if that doesn’t happen? Openness is key to making good use of data. It takes some getting used to, but the performance of individual departments must be reported organization-wide. And hospital leaders not only need to be on board with this; they also need to offer solutions and commit resources to address the quality issues that improved transparency surfaces.

What the above strategies all have in common is they require significant organizational changes that may face resistance. This perhaps illustrates a deeper truth about interoperability that many in the industry have not yet acknowledged—that using data to improve quality and lower costs is not, in fact, a technology issue. The real barriers are cultural, human. Most of the technological capabilities we need already exist; learning to use them collaboratively toward a common goal is the real challenge.
