How to start implementing AI

A bottom-up, grassroots approach can kick-start the use of advanced technology by building experience with proof-of-concept pilots.


Artificial intelligence-led services are already permeating our lives, with many more business use cases being analyzed and new technologies under development.

As rapid advances begin to change industries, markets and the competitive landscape, how can a healthcare organization explore whether AI—and its branches of machine learning and deep learning—makes sense for implementation?

There's a lot of buzz, and there are plenty of gray areas. Many executives are under the impression that they'll have to invest in AI to stay competitive, but they don't yet know how AI would fit into their organization's business model. At the same time, there's a plethora of companies, both established (Google, Microsoft, Amazon) and entrepreneurial (H2O.ai, DataRobot, Skytree), ready to help organizations attack problems with open-source and proprietary tools and methods and arrive at an informed recommendation for investment.



Where do you begin? First, you don't need to understand everything about concepts like neural networks, Bayesian network inference or regression to get going. Start with an exploratory perspective and an open mind.



Here are five recommendations for preparing to test drive AI in your organization:

• Consider AI when building your strategic goals. Look at AI as a means to an end—not the end in itself. AI should advance a strategy, not dictate it.

• Align projects with tangible business goals. Does the organization aim to improve call center service? To reduce employee time spent on repetitive, manual processes? Create a list of applicable use cases with clearly defined success criteria.

• Gain the agility to pivot as needed. Having a centralized analytics team or innovation hub can help an organization hone valuable skills and stay abreast of advances in AI technology. Consider the balance of analytical versus domain expertise and understand the limits of an organization’s technological computing power.

• In this explorative phase, get to know AI players in the marketplace. What do they bring to the table? Attend a conference or symposium to see who's out there. Connect with other groups that are open to exploring these emerging technologies and share lessons learned.

• Choose a focused pilot project and the right approach for the situation (i.e., a proof of concept; see the sketch after this list). For example, one healthcare organization might look into predicting outcomes of a care practice within a therapeutic area. Another organization might pick an operational function and examine current work processes with many manual touchpoints to see where there are opportunities to gain efficiencies.
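To make the pilot idea concrete, here is a minimal sketch of what a first predictive-outcomes proof of concept could look like in Python. It is illustrative only: the file name, the column names ("age", "num_prior_visits", "chronic_condition_count", "readmitted") and the AUC success criterion are hypothetical placeholders, not part of any specific organization's pilot.

```python
# Hypothetical proof-of-concept sketch: predict a care outcome from a small,
# curated pilot extract and measure it against a pre-agreed success criterion.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical extract prepared for the pilot; columns are placeholders.
df = pd.read_csv("pilot_cohort.csv")

features = ["age", "num_prior_visits", "chronic_condition_count"]
X, y = df[features], df["readmitted"]

# Hold out data so the clearly defined success criterion (here, AUC) is
# measured on cases the model has not seen.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Pilot baseline AUC: {auc:.2f}")  # compare against the agreed target
```

A simple baseline like this is often enough for a first pilot: it tests the hypothesis, exposes gaps in the data, and gives the team a yardstick for comparing internal work against vendor proposals.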

Know what’s in your garage

More than once, each of us has tackled a home improvement project or just ventured out to the garage looking for that one tool or gadget, only to realize that we don't know what we do (or don't) have on hand. A conversation about AI would not be complete without a discussion of data. It's never too late to start understanding and cleaning up an enterprise's "big data," the information that's been collected and amassed for all the various priorities in its portfolio. What can be done today is to make that data readily accessible: label it, curate it and make it meaningful. After an organization has decided to take on a pilot AI project, data will be the fuel for the algorithms, just as gasoline is for automobiles.
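A quick data inventory is one way to find out what's in the garage before committing to a pilot. The sketch below, assuming a single flat extract in a hypothetical file with hypothetical columns, simply profiles what exists: types, completeness and distinct values.

```python
# Minimal data-inventory sketch: profile an extract before any modeling.
# The file name and its columns are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("enterprise_extract.csv")

inventory = pd.DataFrame({
    "dtype": df.dtypes.astype(str),          # what kind of field it is
    "non_null": df.notna().sum(),            # how many usable values exist
    "pct_missing": (df.isna().mean() * 100).round(1),
    "unique_values": df.nunique(),           # rough sense of cardinality
})

print(f"{len(df):,} rows, {df.shape[1]} columns")
print(inventory.sort_values("pct_missing", ascending=False))
```

Even a simple report like this makes the labeling and curation work visible, and it helps the team judge whether the data can fuel the pilot that's been chosen.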

Operationalize AI from the bottom up

Take a bottom-up, grassroots approach to AI by putting it to the test with proof-of-concept pilots. The end goal is to learn: test hypotheses, evaluate internal analytical abilities, assess potential AI partners and their tools and methods, and innovate where the organization hasn't before.

Prep, test and launch

The "Prep, Test and Launch" framework provides both an internal and external lens to validate the benefits of AI as they relate to an organization's goals and objectives. It applies a "bake-off" approach to understanding internal and vendor analytic skillsets while enhancing the team's understanding of AI.

Organizations that adopt the "Prep, Test and Launch" framework begin to think differently, operate with agility and learn what matters most. They bridge the gaps between value-added AI capabilities, concept pilots and strategic execution. They gain the insight to inform their broader strategy, including an overall AI strategy.

In the long run, incorporating AI may bring more than technological change; there's likely to be a cultural change attached as well. Starting with a bottom-up approach and being transparent about what the organization learns from its proof-of-concept pilots can help build broader understanding and buy-in.

At a time when there are still a lot of unknowns in the AI landscape, there is plenty of opportunity to get ahead of the game and stand out from the crowd. By effectively using AI tools and methods that are readily available right now, an organization may be able to accelerate its path to achieving long-term objectives.

With a bottom-up approach, HIT executives can create evidence-based learning experiments that provide new insight into both internal and vendor capabilities. Equipped with the right use cases to solve, they'll understand the value AI offers and determine whether it should play a role within the organization's larger mission and vision.

As always, let organizational goals be the guide and look for passionate adopters who are innovative, ready and willing to evolve.

Emily Moro is a principal with Point B, an integrated management consulting, venture investment and real estate development firm.
