The push for interoperability gains new urgency in 2019

Regulatory pressure from CMS is expected to rise, with the intent of improving data exchange among providers.


Now that most providers have implemented an electronic health or medical record system to comply with federal Meaningful Use requirements, the government has turned its health IT attention to interoperability. In fact, last April, the Centers for Medicare and Medicaid Services renamed the program—it’s now Promoting Interoperability.

It makes sense to shift the attention to data exchange, experts say. “There’s no need to measure things people are already doing universally,” says David Harlow, a lawyer and consultant who specializes in healthcare business and regulatory issues.

But those experts also say it’s far from universal for organizations to actually use the data within EHR and EMR systems—along with other sources such as patient portals, connected devices, sensors and apps—to deliver meaningful, actionable information that improves outcomes across the care continuum and at the point of care for populations of patients. Although there are hurdles to overcome, the industry—driven by new regulations, incentives and the shift to value-based care—is definitely moving in that direction.

“The technology is advancing rapidly, so we’ll have the capability to do things in healthcare that we couldn’t do five years ago with our EMRs and our HIEs,” says Brian Ahier, a board member of the not-for-profit Sequoia Project, which works to advance the implementation of secure, interoperable nationwide health information exchange. “I think 2019 is going to be a good year for interoperability.”

In addition to the Promoting Interoperability program, the final rule implementing the 21st Century Cures Act, a law signed by President Obama and supported by the Trump administration, was expected to emerge near the start of the new year. The Cures Act is designed to speed medical advancements, in part by penalizing egregious information blocking, although regulators haven’t yet established specific penalties for such interference with the free flow of information.

They’re coming, though, along with other government interoperability programs on the horizon, and the industry must get ready. “At this stage of the game, we have the tools that allow for widespread availability of information in the data-blocking context, so provider to provider and … to the patient, it should be universal at this point—but it’s not,” Harlow says.

Incentives needed
Another regulation pushing the industry toward interoperability is the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA), which empowered CMS to create alternative payment models, such as accountable care organizations. The shift from fee-for-service to pay-for-quality gives payers and providers a financial incentive to share data.

“The Obama administration and the Trump administration don’t have a whole lot in common,” Ahier says. “In policy perspectives there’s a pretty big chasm between them. But not in this regard. They’re all in on value-based care.”

The industry can’t achieve the vision of value-based care without interoperable health IT, Harlow adds. “It’s just not going to happen.”

While data may flow freely within large organizations, a surge of mergers and acquisitions has left providers with a variety of systems that don’t necessarily talk to each other, creating “a potential interoperability problem that’s going to work against you in terms of value-based care management,” Harlow says. “The financial relationships in the value-based care models need to accommodate the idea that providers inside and outside those walls may need to work together.”

“But until those financial incentives are firmly in place, the business disincentives to interoperability and data sharing linger,” says Wen Dombrowski, MD, a physician informaticist and chief convergence officer at Catalaize, a consulting firm that advises healthcare organizations, technology companies and investors on innovation and emerging technologies. “Large health systems competing for market share in the same region struggle to make a business case for sharing the data they’ve gathered.

“It’s reasonable to make enough money to keep the doors open, keep all the nurses employed and things like that—I understand the rationale behind it,” Dombrowski says. “But as long as we’re locked into a profit-driven kind of economic structure, I don’t think interoperability will progress much, regardless of what CMS, ONC and other bodies promote and regardless of all the statements and promises. Fundamentally, if there’s not an overwhelming incentive for healthcare organizations and vendors to be interoperable, then it’s not going to happen.”

And until healthcare does decide it wants to share, “achieving the clinical, research and practical benefits is going to be stunted,” Dombrowski says. “Because there are already elegant data management tools and AI algorithms developed, a lot of products and solutions are already possible. No one has to invent these per se. But the fact is, if the data is not easily accessible in a fairly real-time manner, then the projects that organizations can do are limited in terms of the promise and magic of predictive analytics and machine learning.”

Value-based care
Rob Barras, vice president of health solutions at CTG, an IT advisory and services company, says ideally healthcare organizations would want to share data for altruistic reasons—to improve clinical care—but realistically, the potential of data sharing to optimize value-based care agreements is ultimately what will drive interoperability.

“You cannot make decisions about a population until you have the data you need,” especially at the point of care, Barras says. “That’s the real value of integrating all that data in one place: so we can understand the health of the population as opposed to the individual.”

The biggest barrier right now, he adds, is that most organizations are just beginning to understand the requirements of value-based care. And IT leaders, in particular, still need help determining how to use technology to make it happen. “If IT doesn’t understand it, if the CIO doesn’t understand it, what are the chances the chief nursing officer or the VP of population health understands it?” Barras says. “There are still a lot of heads in the sand that we’re living in a fee-for-service world.”

Ahier agrees, but says lack of understanding and denial won’t cut it much longer. “Providers stuck in fee-for-service or transitioning into these new payment care delivery models have one foot on the dock and one on the rowboat, and the rowboat is going out to sea. They’re not going to let you get back on the dock,” he says. “That’s where I see the most complicating factors for the boots-on-the-ground, in-the-trenches folks; the informaticists, the physicians, the nurses, the clinicians of all stripes who are caring for patients in the value-based realm.

“Fortunately,” he adds, “there’s a lot of great technology that can help them do that.”

But the technical issue is currently being solved at the minimal dataset level, according to Harlow. “It should be solved at a broader level and that should enable value-based care,” he says.

“Interoperability is not an end in itself. It drives better value-based care along a number of dimensions. It’s about provider payments, it’s about outcomes and it’s about care management—the process of care. Along all those different dimensions there should be incentives to provider organizations to up the ante on interoperability.”

Some larger EHR companies, including Epic, are working toward those goals by creating interoperability functions within their own systems. And then there are companies, such as Cerner, that offer care coordination solutions.

But a market is also emerging for purpose-built, vendor-agnostic solutions, mainly from small companies that are focused on care coordination and sharing data across the care continuum, such as CareCentrix and PatientPing.

These companies “offer very important capabilities, which enable the entire healthcare network to stay connected with each other,” says Koustav Chatterjee, a Frost & Sullivan healthcare industry consultant and analyst. “As patients come in with more complicated conditions, it becomes increasingly difficult for the payer and provider to offer that holistic, evidence-based clinical judgment to that patient at the point of care. And that’s why they don’t have any other option but to reach out to ... specialist healthcare IT vendors who are offering the bandwidth and capabilities so that everyone wins.”

The market, he adds, “is perfectly poised to aid the next revolution of healthcare IT” and the move toward interoperability.

Still, healthcare leaders, especially CFOs, are often hesitant to spend money on additional value-based services. They think the EHR “should be able to do everything,” including coordinating care and reporting quality metrics, Chatterjee says. But because every payer has a different set of guidelines and protocols, “it’s literally impossible for an EHR to perform all those activities for that complex patient population.”

In some ways, the barriers to interoperability aren’t technical at all. There are plenty of tools and data sources to mine, including sensors, connected medical devices, patient apps and wearables, all of which are improving care quality right now.

Sensors, in particular, hold tremendous promise for the healthcare industry, which can benefit from auto industry expertise in this area.

“There’s a lot of applicability in terms of what’s being developed on the technological front with self-driving cars,” Dombrowski says. “It’s an example of where you’re taking a large amount of data from many sensor sources and AI algorithms analyze it in real time—instantly—and it generates a recommendation or action that has life or death consequences; it has a high level of criticality.”

And in the auto industry—as in the healthcare industry—technology that creates more responsive algorithms is key “so it doesn’t take two minutes to analyze and react to what happened two minutes ago,” she adds.

Other non-technical approaches that can move data-sharing forward include data governance; better trained, more efficient IT departments; and input from patients—an important group that’s sometimes overlooked in the interoperability conversation.

While government incentives and the natural course of technological advances have largely eliminated the paper silo, these efforts have created a digital data silo, says Ahier.

Data governance’s role
Data governance is one way to break down the walls.

Getting a handle on data is key to interoperability because the way electronic information is gathered and reported can vary widely. The definition of length of stay at one hospital may differ from the definition at another, for example, or the way clinicians record blood pressure readings may vary from site to site. Such variations—and the fact that the industry is unlikely to ever agree on one system or one set of recording methods—are part of the reason many organizations struggle to adopt analytics tools.
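To make that variation concrete, here is a minimal sketch in Python (with invented dates, purely for illustration) of how two sites could report different lengths of stay for the same admission, one counting elapsed 24-hour periods and the other counting midnights crossed:

```python
from datetime import datetime

admit = datetime(2019, 1, 1, 23, 0)      # admitted 11 p.m. on Jan. 1
discharge = datetime(2019, 1, 2, 9, 0)   # discharged 9 a.m. on Jan. 2

# Site A defines length of stay as whole elapsed 24-hour periods
los_site_a = (discharge - admit).days                 # -> 0

# Site B defines length of stay as the number of midnights crossed
los_site_b = (discharge.date() - admit.date()).days   # -> 1

print(los_site_a, los_site_b)  # the same stay yields two different answers
```

Both numbers are locally “correct,” which is exactly why cross-site analytics break down without an agreed-upon definition.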

“You’re not going to break down all those silos, so then what?” Barras says. “Data governance is the ‘then what.’ ”

With value-based care, organizations often struggle to identify who “owns” the process of gathering and sharing data, Barras explains. Some think it’s the chief medical officer, the vice president of quality or the population health director. But the correct answer is “all of the above” and more. “That’s what a strong governance platform will provide—that clarity that we all own it,” Barras says, noting that the time to put data governance in place “is now.”

Many organizations are spending a lot of money on interoperability systems, data warehousing and data analytics, Dombrowski says. “Frankly speaking, all that cost could have been prevented if there was cleaner, interoperable data upfront.”

IT staff’s role
Another challenge is recruiting experienced IT staff who can build on industry best practices and who are adept at skills such as semantic interoperability and data mapping, which help achieve higher levels of information exchange.

“A lot of organizations are trying to reinvent the wheel,” Dombrowski says. “They might have either junior teams or experienced teams, and they are trying to build things from scratch.”

That’s costly and time-consuming, she says.

“It takes a lot of effort, and if the team is not experienced, software and data exchange systems could be built or set up incorrectly. We see that over and over again, where the technical system … is a little off.”

For example, errors may be present in the patient portal, even if the information is correct in the medical records. “If they can’t even get my first name correct, then how do I know these labs are mine? This is such a basic example of interoperability,” she says.

Semantic interoperability, too, is a big challenge in healthcare, where vital signs like blood pressure readings can be reported in different formats not only from one system to another but from one department to another, she adds.

Because information exchange is about more than just moving piles of data from point A to point B, data mapping is a critical IT skill. “It’s not as simple as just dumping all the data into one database. That data still might not be able to talk to each other,” Dombrowski says.
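As an illustration of that point, here is a minimal, hypothetical data-mapping sketch in Python; the field names are invented, and real-world mappings typically target standards such as HL7 FHIR. It normalizes blood pressure readings from two source systems into one canonical schema so the records can finally “talk to each other”:

```python
# Hypothetical example: normalize blood pressure readings from two
# source systems into one canonical record so they can be queried together.

def from_system_a(row):
    # System A stores a combined string, e.g. {"bp": "120/80", "unit": "mmHg"}
    systolic, diastolic = row["bp"].split("/")
    return {"systolic_mmhg": int(systolic), "diastolic_mmhg": int(diastolic)}

def from_system_b(row):
    # System B stores separate numeric fields, e.g. {"sys": 120, "dia": 80}
    return {"systolic_mmhg": row["sys"], "diastolic_mmhg": row["dia"]}

MAPPERS = {"A": from_system_a, "B": from_system_b}

readings = [
    ("A", {"bp": "120/80", "unit": "mmHg"}),
    ("B", {"sys": 135, "dia": 85}),
]

# After mapping, both sources speak the same schema
canonical = [MAPPERS[source](row) for source, row in readings]
print(canonical)
```

The mapping functions are trivial here, but in practice each source system needs its own carefully validated translation, which is why data mapping is a distinct IT skill rather than a one-time database merge.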

Providers must have teams in place that not only understand these barriers to interoperability but also have the ability to solve them.

Patient participation
Ironically, another ingredient missing from the data-sharing mix is the very reason healthcare needs to do a better job of interoperability: the patient.

Even government regulations have shifted away from broad patient participation, Harlow says, amid complaints from clinicians that patient engagement in healthcare data is out of their control.

“I still hear many stories from people who can’t get access to their records,” he says. “It’s surprising that it’s still this difficult. The rules are clear, but there are institutional practices in place, and some of them are still at odds with the rules. … The idea that the patient controls the distribution of information seems to be lost on a lot of people.”

HIPAA, he adds, ends up being used as an excuse not to share records.

Dombrowski also advocates for a patient-centric approach.

“A lot of the discussion about interoperability centers around transmission of EHR records,” she says. “A lot of it is about interoperability of the whole EHR or a small subset, such as a discharge summary or problem list. As a clinician, having that basic summary is useful, but it’s not sufficient to capture the whole narrative about a patient and what’s happened over time.”

Especially for patients with complex medical cases, interoperability needs to go “much further,” she adds.

Although the concept of patient-generated health data is gaining traction, solutions to harness it don’t always make sense. “Most patient portals are dead ends,” Dombrowski says. “The information that patients enter doesn’t always go into the EHR or get pushed back to the doctor. If you’re a patient who is really bored, you can type your information into a computer … but it has no impact.

“I would go beyond what you can document in the portal and actually get the patient-centered health data in the form of many different apps,” she adds. That includes everything from basic health apps, such as those that track pain or fertility, to fitness apps and trackers.

Bottom line, to treat populations of patients, providers must make decisions based on information about those populations. And that requires interoperable data sharing.

“Once all stakeholders understand that you can’t make decisions about a population you’re trying to manage with only parts of the data,” Barras says, the healthcare industry will find ways to overcome any remaining barriers to interoperability.
