Creating an AI ‘Ecosystem’

Key steps include developing AI models, establishing governance processes and creating workflows.


The healthcare sector needs to create an “ecosystem” that enables artificial intelligence to flourish, says Suresh Balu, program director at the Duke Institute for Health Innovation.

“Right now, AI is in the nascent stages. A lot more needs to be developed so we can realize the full potential,” he says.

Balu compares the current stage in AI development to the time when the first automobiles were built. “Back then, there was no network of roads or bridges. You also did not have a lot of options to consider when parts failed.”

Today, healthcare needs to build a vast AI ecosystem that enables a variety of capabilities and best practices, Balu says. Key steps include creating trustworthy and adaptable AI models, establishing data governance processes, educating the healthcare workforce on how to use AI and creating workflows for monitoring AI to ensure continued accuracy. 

How can healthcare organizations help build this ecosystem? “Some capabilities you will need to develop on your own, some you can partner to attain, and some will need to be built by regulatory bodies,” Balu says.

Role of healthcare organizations

Healthcare leaders need to determine the roles their organizations will play in the AI ecosystem, says Intermountain Healthcare’s Greg Nelson, assistant vice president of analytics services.


Intermountain is among a cadre of large health systems and academic medical centers that have invested in a data science team to develop and implement their own AI models.

“Not every organization is going to have a deep bench of experts in data science like we do,” Nelson says. “You may need to rely on partners and industry coalitions to be your DevOps (development and operations) for models. Your role may be testing, validation and change management within your organization.”


Another critical responsibility of healthcare leaders is determining when and how to use AI. “The C-suite should be looking at what strategic problems they want to address,” Balu says. “The solutions do not have to include AI. Any solution that solves a specific problem with a good return on investment is what you want to go after.”

Before deploying AI models, organizations must have rigorous data governance systems to ensure risks, such as model bias or model drift, are detected, mitigated and managed, Balu says. Duke Health has established a governance process that extends across the lifecycle of machine-learning models, from model design and development through evaluation and ongoing monitoring.

Clinicians in healthcare organizations also have a key role to play in AI development and implementation. (See also: 4 Strategies for Engaging Clinicians in AI)

Role of vendors

As AI adoption advances, expect an arms race to develop between EHR vendors and companies specializing in AI, says John Lee, MD, senior vice president and CMIO at Allegheny Health Network.

“On the one hand, you have AI vendors that have a head start on the data science and algorithmic expertise,” he says. “But they have to somehow plug their solutions into workflows at healthcare organizations.

“On the other hand, EHR vendors have direct access to patient data and far more influence on transactional workflows for clinical staff and patients. But they are playing catch-up in terms of hardcore data science.”

A healthcare organization’s relationship with its vendors is key to successfully adopting a commercially developed AI model.

“Executive sponsors don’t have to be experts in AI, and organizations don’t necessarily need to have their own data scientists,” says Carol McCall, chief health analytics officer at ClosedLoop, a predictive AI company. “They can leave the driving to us.”

At a minimum, however, organizations need to supply two pieces of information when purchasing a machine-learning model, McCall says. “They need to be able to explain what data they’ll be sending us and where it comes from. Even more, they need to know how they’re going to use the answers the program gives them. Knowing that helps us ensure the model’s predictions are designed to power up the actions the organization wants to take.”

Role of regulators

Some AI issues will need to be addressed by regulators, Balu says.

One example is reviewing the safety and efficacy of specific machine-learning solutions before they go to market. Because machine-learning models are continually changing, regulators also need to provide ongoing monitoring to ensure continued performance and safety, he adds.

As of September 2021, the U.S. Food and Drug Administration had reviewed and provided information on more than 300 medical devices enabled with AI/machine learning. The FDA is also considering developing a lifecycle-based regulatory framework for these technologies, which aims to address the changing nature of machine learning models. In addition to a pre-market review, the FDA would issue recommendations on how often a model needs to be assessed and updated.

Meanwhile, the FDA and regulatory agencies in Canada and the United Kingdom released 10 guiding principles in October 2021 for the development of good machine-learning practice.

Role of industry coalitions

Several collaborative partnerships and coalitions have been formed to help with the creation of an AI ecosystem.

For example, in January, the Artificial Intelligence Industry Innovation Coalition was announced. The coalition brings together healthcare providers, vendors and others with the goal of providing recommendations, tools and best practices for AI in healthcare.

“This coalition is going to be important because it gives us a voice for how we should be thinking about standards, processes and best practices nationally,” says Nelson of Intermountain, which is a member.

Identifying and adopting best practices will be important as healthcare organizations navigate AI implementation.

“Historically, many vendors have used AI models as their secret sauce,” Nelson says. “My hope would be that the algorithm is not the special thing; what’s special is how you operationalize that model in your environment, tune it to your patient populations and integrate it into the system of care. That’s going to be the secret sauce – and that’s the hard stuff.”


Stories in this series

Self-driving cars and hospital-rounding robot residents

4 Strategies for Engaging Clinicians in AI
