UCLA uses artificial intelligence to create virtual radiology advisor

Machine learning application interacts with clinicians; could be applied to other medical specialties.


Interventional radiologists at the UCLA Medical Center are leveraging artificial intelligence to create a “chatbot” that automatically communicates with referring clinicians, providing them with evidence-based answers to frequently asked questions.

Currently, the AI-powered prototype is being tested by a small UCLA team of hospitalists, radiation oncologists and interventional radiologists. The machine learning application, which acts as a virtual radiology assistant, enables clinicians to rapidly access evidence-based information while freeing them to perform other duties and focus on patient care.

The information is delivered in multiple formats, including relevant websites, infographics and subprograms within the application. If the tool determines that a question requires a human response, it provides contact information for an actual interventional radiologist. As clinicians use the application, which is focused on diagnostic and interventional radiology, it learns from each encounter, using deep learning techniques to refine its evidence-based answers.

“The more it’s used, the smarter it gets,” says Kevin Seals, MD, resident physician in radiology at UCLA and the programmer of the application, who notes that the application’s user interface consists of text boxes arranged in a manner simulating communication via traditional SMS text messaging services.

“It feels like you’re texting with a human, but you’re texting with artificial intelligence, so the responses are coming from a computer,” observes Seals, who has a background in engineering. “For clinicians in the hospital who aren’t radiologists, it’s a way to speak with a simulated radiologist.”

To build the application's knowledge base, Seals says researchers fed it more than 2,000 example data points simulating common queries that interventional radiologists receive during a consultation. He adds that natural language processing was implemented using the Watson Natural Language Classifier application programming interface, so that user inputs are understood and paired with relevant information categories of interest to clinicians.
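In code, such a classification call might look like the minimal sketch below, which follows the request and response shape of the Watson Natural Language Classifier REST API as IBM documented it in this era (the service has since been retired). The classifier ID, credentials and the example category label are hypothetical placeholders, not details of the UCLA application.

import requests

# Endpoint shape per IBM's public Watson NLC documentation of this era.
WATSON_URL = ("https://gateway.watsonplatform.net/natural-language-classifier"
              "/api/v1/classifiers/{classifier_id}/classify")

def classify_question(text, classifier_id, username, password):
    """Send a clinician's free-text question; return ranked categories."""
    resp = requests.post(
        WATSON_URL.format(classifier_id=classifier_id),
        auth=(username, password),   # Watson service credentials
        json={"text": text},         # the question to classify
    )
    resp.raise_for_status()
    result = resp.json()
    # "top_class" is the best-matching category; "classes" holds every
    # candidate category with a confidence score, sorted best-first.
    return result["top_class"], result["classes"]

# Hypothetical usage -- the classifier ID and credentials are placeholders.
top_class, candidates = classify_question(
    "Is an IVC filter appropriate for a patient who cannot be anticoagulated?",
    classifier_id="10D41B-nlc-1",
    username="apiuser",
    password="apikey",
)
print(top_class)  # e.g. "ivc_filter"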

For example, if a clinician asks whether placement of an inferior vena cava filter, a medical device that is implanted by interventional radiologists, is appropriate for a particular patient, the question is matched to the IVC filter category and relevant information is provided.
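Once a question is labeled, the routing step itself can be simple. The sketch below is an illustrative guess at that step, combining a category lookup with the human-handoff behavior described earlier; the category names, URLs, confidence threshold and contact line are all invented for the example, not the application's actual data.

# Illustrative routing of a classified question to an answer.
RESPONSES = {
    "ivc_filter": ("IVC filters are considered for venous thromboembolism "
                   "when anticoagulation is contraindicated. "
                   "Overview: https://example.org/ivc-filter"),
    "paracentesis": "Procedure guide: https://example.org/paracentesis",
}

HUMAN_FALLBACK = ("This question needs a human response. Contact the "
                  "on-call interventional radiologist: 310-555-0100.")

CONFIDENCE_THRESHOLD = 0.75  # below this, defer to a human

def respond(top_class, classes):
    """Map the top category to curated content, or hand off to a human."""
    confidence = classes[0]["confidence"] if classes else 0.0
    if top_class in RESPONSES and confidence >= CONFIDENCE_THRESHOLD:
        return RESPONSES[top_class]
    return HUMAN_FALLBACK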

Last week, Seals presented research on the application at the Society of Interventional Radiology’s 2017 Annual Scientific Meeting in Washington. While the machine learning application is focused on diagnostic and interventional radiology, he contends that it could ultimately be applied to other medical specialties.

“It’s getting really close to the point where we’d like to have a wider release across the UCLA Medical Center,” concludes Seals. “It works very well now. About 90 percent of its functionality, roughly, gets it right every single time. The difference between it working really well and it working potentially perfectly is just entering more data so that it becomes smarter.”
