AI accelerates pneumonia diagnoses in study at Intermountain
Artificial intelligence can help radiologists spot key findings in chest X-rays of suspected pneumonia patients in the emergency room within 10 seconds.
That finding, by researchers at Intermountain Healthcare and Stanford University, represents a dramatic improvement over previous averages of 20 minutes or more and suggests that artificial intelligence can enable treatment to start sooner.
The researchers studied CheXpert, an automated chest X-ray interpretation model built at Stanford, using it to review images from several Intermountain emergency departments in Utah. They concluded that the system identified key findings very accurately and that its 10-second response time significantly outperforms current clinical practice.
“CheXpert is going to be faster and as accurate as radiologists viewing the studies,” says Nathan Dean, MD, principal investigator of the study. “It’s an exciting new way of thinking about diagnosing and treating to provide the very best care possible.”
CheXpert was created by the Stanford Machine Learning Group, which trained a model on 188,000 chest imaging studies to determine what is and what is not pneumonia on an X-ray.
Because patient populations vary by geographic location, CheXpert was then fine-tuned for Utah on an additional 6,973 images from Intermountain emergency departments.
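CheXpert itself is a deep convolutional network trained on real radiographs, but the pretrain-then-fine-tune pattern the researchers describe can be illustrated on a much smaller scale. The sketch below is a hypothetical stand-in using a simple logistic classifier on synthetic data: it "pretrains" on a large source population, then continues training the same weights on a smaller set whose feature distribution has shifted, the analogue of adapting the model to one hospital system's patient mix. All names, data and numbers here are illustrative, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # Clip to avoid overflow in exp() for large |z|.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -500, 500)))

def train_logistic(X, y, w=None, lr=0.1, epochs=300):
    """Batch gradient descent for logistic regression. Passing `w`
    continues training from existing weights, i.e. fine-tuning."""
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def with_bias(X):
    # Append a constant column so the model can learn an intercept.
    return np.hstack([X, np.ones((len(X), 1))])

w_true = np.array([1.5, -2.0, 0.5, 0.0, 1.0])

# "Pretraining" corpus: a large simulated source population.
X_src = rng.normal(size=(2000, 5))
y_src = (X_src @ w_true > 0).astype(float)
w_pre = train_logistic(with_bias(X_src), y_src)

# Site-specific data: same underlying signal, but a shifted feature
# distribution and decision threshold -- the analogue of one hospital's
# patient population differing from the pretraining corpus.
X_site = rng.normal(loc=0.5, size=(300, 5))
y_site = (X_site @ w_true + 0.8 > 0).astype(float)

# Fine-tune: continue training the pretrained weights on local data.
w_ft = train_logistic(with_bias(X_site), y_site, w=w_pre.copy())

acc_pre = float(np.mean((sigmoid(with_bias(X_site) @ w_pre) > 0.5) == y_site))
acc_ft = float(np.mean((sigmoid(with_bias(X_site) @ w_ft) > 0.5) == y_site))
print(f"pretrained accuracy on site data: {acc_pre:.2f}")
print(f"fine-tuned accuracy on site data: {acc_ft:.2f}")
```

The design point is that fine-tuning reuses the pretrained weights rather than starting from scratch, so a few thousand local images can adapt a model originally built from 188,000 studies.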
“We’ve been developing deep learning algorithms that can automatically detect pneumonia and related findings in chest X-rays,” says Jeremy Irvin, a doctoral student at Stanford and a member of the research team. “In this initial study, we’ve demonstrated the algorithm’s potential by validating it on patients in the emergency departments at Intermountain. Our hope is that the algorithm can improve the quality of pneumonia care at Intermountain, from improving diagnostic accuracy to reducing time to diagnosis.”
At Intermountain emergency departments, radiology reports are run through Cerner's natural language processing (NLP) system, a support tool that extracts needed information from the radiologist's report. The NLP system then sends that information to ePNa, an electronic clinical decision support tool that is part of pneumonia care at Intermountain.
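The flow described above, free-text radiology report in, extracted findings out, findings feeding a decision-support rule, can be sketched minimally. Cerner's NLP system and ePNa are proprietary, so everything below (the finding names, patterns, and pathway strings) is a hypothetical illustration of the data flow, not their actual logic.

```python
import re

# Hypothetical findings a decision-support step might care about; the
# real Cerner NLP pipeline and ePNa rules are proprietary -- this only
# sketches the report -> findings -> recommendation flow.
FINDINGS = {
    "pneumonia": r"\b(pneumonia|consolidation|infiltrate)\b",
    "pleural_effusion": r"\bpleural effusion\b",
    "multilobar": r"\b(multilobar|bilateral)\b",
}

def extract_findings(report_text: str) -> dict:
    """Stand-in for the NLP step: pull key findings out of a
    free-text radiology report via keyword patterns."""
    text = report_text.lower()
    return {name: bool(re.search(pat, text)) for name, pat in FINDINGS.items()}

def decision_support(findings: dict) -> str:
    """Stand-in for the downstream decision-support step."""
    if findings["pneumonia"] and findings["multilobar"]:
        return "treat: multilobar pneumonia pathway"
    if findings["pneumonia"]:
        return "treat: standard pneumonia pathway"
    return "no pneumonia findings"

report = ("Bilateral lower-lobe consolidation with a small "
          "left pleural effusion.")
findings = extract_findings(report)
print(findings)
print(decision_support(findings))
```

A model like CheXpert that reads the image directly would replace the `extract_findings` step, emitting the findings dictionary without waiting for a dictated report.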
For the many emergency departments where ePNa is not available, the CheXpert model could deliver chest X-ray findings directly to physicians, Dean explains. "Using the CheXpert system, we found the interpretation time was very swift and the accuracy of reports to be very high."
In the study, radiologists reviewed images from 461 patients, rating each on four metrics for how likely the patient was to have pneumonia. They also assessed whether the disease appeared in multiple parts of the lungs and whether patients had fluid build-up between the lungs and the chest cavity.
Overall, the researchers found that the CheXpert model outperformed the current workflow, in which radiologist-generated reports are processed by NLP, on all important pneumonia findings.
The researchers also found that CheXpert completed its processing in less than 10 seconds, compared with the 20 minutes to several hours required for a radiologist report plus NLP, and that NLP processing of radiology reports was the most frequent source of errors within the ePNa decision support system.
“A 2013 study found that 59 percent of errors made by ePNa were a result of NLP processing of radiologist reports, so we’re eager to replace it with a better, faster system,” says Dean. The next step, he adds, will be for the CheXpert model to be used in select Intermountain ERs this fall.