Radiology departments were once a lode of recurring revenue for healthcare organizations that earned their dollars through billing volume.
Now, these same expensive investments in imaging systems, archives, image management software and radiology staff have to provide value to the enterprise as it shifts to containing costs while demonstrating clinical worth to performance-minded payers.
For radiologists, charged with reading and making decisions on the output of digital X-rays, CT scans, MRI studies and other modalities, the new cost-related demands will add to the existing challenges of reading scans for any possible abnormality, especially the small ones that can grow into big and costly medical problems if not caught early.
"The transition of radiologists to [being] a value-based provider means the tools that the radiologists use to create their product, which is the report, are necessarily going to change," says Kevin McEnery, MD, director of innovation in imaging informatics at University of Texas MD Anderson Cancer Center.
Among the technology tools that could change the process of image examination are analytical solutions that assist in interpreting images by finding suspicious markers they've been programmed to detect and evaluating whether they could be tumors or other concerns.
Known as CAD, short for either computer-assisted detection or diagnosis, this technology has been on the market for more than 10 years, becoming routine for mammography and lung cancer detection and spreading into areas such as coronary artery imaging and colonoscopy. "If they're used [for the right purposes], they can increase productivity, and diagnostic quality, and diagnostic yield," says David Mendelson, MD, chief of clinical informatics at Mount Sinai Hospital, New York.
With the widespread adoption of electronic medical record systems, and higher sophistication of informatics associated with imaging technology, interpretive assistance is coming from additional quarters, including:
* Intensive deployment of information in EMRs at the point of image reading, to automatically supply context on a patient's current and recent medical situation, the observations of the patient's doctors, and a better understanding of what the radiologist should be looking for and why.
* Communication of relevant information from previous scans, such as calculations and measurements, to the current image for immediate use in the reading and analysis.
Despite their promise, these computerized aids are tough calls for CIOs and other technology decision-makers because of abundant concern about their accuracy, both in finding possible problem areas (their sensitivity) and in identifying those areas as precisely what the radiologist is searching for (their specificity). Inaccuracies result in either false-positive or false-negative interpretations.
Also, Mendelson says there is hesitation because of the additional time required to run the image through a computer program. "There is a tension between an improved yield (finding things that you missed) and diminished overall productivity," he says.
Pressure in general to keep testing and treatment costs in check is hitting the radiology function in particular, creating a need to improve the value proposition of radiologists, McEnery says. As payers step up scrutiny of the medical necessity of procedures, the American College of Radiology has issued guidelines for optimal imaging services, which it defines as providing all the imaging care that is beneficial while not conducting imaging that is not useful.
Imaging is beneficial when it provides information to either justify a medical intervention or wave it off, and the odds of getting it right depend on everything from noticing a minute density on the scan to breaking through the tedium of mainly negative reads on hundreds of image slices per day. CAD systems, which don't get tired or eye-weary, use algorithms to find areas of concern on images, lab slides and other views of human anatomy in digital form.
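The "areas of concern" idea can be caricatured in a few lines. The following is a toy illustration only, not any vendor's algorithm: it flags pixels that stand out sharply from the background, whereas real CAD systems rely on far more elaborate shape, texture and machine-learned features. The image data is invented.

```python
import numpy as np

def flag_bright_spots(image, num_std=3.0):
    """Return (row, col) coordinates of pixels more than num_std standard
    deviations above the image mean -- a crude 'suspicious marker' finder."""
    mean, std = image.mean(), image.std()
    ys, xs = np.where(image > mean + num_std * std)
    return list(zip(ys.tolist(), xs.tolist()))

# Hypothetical 8x8 "slice": uniform background with one bright pixel.
image = np.zeros((8, 8))
image[3, 4] = 1.0
print(flag_bright_spots(image))  # prints [(3, 4)]
```

Unlike a human reader, this kind of rule applies the same scrutiny to the first slice of the day and the five-hundredth, which is the fatigue advantage the article describes.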
For example, after rigorous testing and Food and Drug Administration approval, systems were built that scan pathology slides for abnormal findings, says Mitchell Goldburgh, cloud imaging solutions manager at Dell. Pathologists see as many normal slides as they do abnormal, and they can only look at a certain number per hour, Goldburgh says. With the computer system indicating the true positives, professionals on staff can look mainly at those. "Now, the productivity of a cytotechnologist goes up." It's a similar situation for computer-aided tools in radiology, he says.
Interpretation assistance systems are a good fit in "comparative analysis between two different time frames," says Shaun Land, a consultant specializing in healthcare enterprise architecture issues, including uses of medical equipment, for Irvine Labs. "You can actually compare the [image] slices between last year and this year. And if there's any change in mass, even very small and minute, then it can help detect that."
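The comparative analysis Land describes can be sketched minimally as a difference between two registered slices. This is an assumption-laden toy, using NumPy arrays as stand-ins for pixel data; real systems first align (register) the two studies and correct for acquisition differences, none of which is modeled here.

```python
import numpy as np

def changed_region_fraction(prior, current, threshold=0.1):
    """Return the fraction of pixels whose intensity changed by more than
    `threshold` between studies -- a crude proxy for change in a mass."""
    if prior.shape != current.shape:
        raise ValueError("slices must be registered to the same grid")
    diff = np.abs(current.astype(float) - prior.astype(float))
    return float((diff > threshold).mean())

# Hypothetical 4x4 slices: a small region brightens between last year's
# study and this year's.
prior = np.zeros((4, 4))
current = prior.copy()
current[1:3, 1:3] = 0.5   # a "mass" appears in four pixels

print(changed_region_fraction(prior, current))  # 4 of 16 pixels -> prints 0.25
```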
In emergency situations, where minutes count in diagnosing (for example, in determining the cause of chest pain), a CAD system for coronary CT angiography does a virtual triage of cases, running specialized algorithms to determine whether a patient has significant artery blockage or not. Called COR Analyzer, the system developed by Haifa, Israel-based Rcadia identifies patients who need to be taken immediately to the cath lab for coronary angioplasty, speeding up a second read by a heart specialist and prompting quick action.
"The aim of this whole setup is to reduce the [number] of such cases to the very minimum," says Roman Goldenberg, MD, vice president of research and development for Rcadia. If the presence of disease is low for patients with chest pain, "then you can sift out a huge amount of such patients" who don't need costly intervention. Angiograms that are diagnosed as negative still can be read in due course by emergency department physicians, he emphasizes.
However, there's still a significant debate over how widely CAD can be used. Some say it can be used to conduct first reads of images, not just relegated to be a backup to a radiologist's review of a scan. In general, radiologists are at odds about the accuracy of current technology.
One source of that problem is how the algorithms are calibrated: at one end of the spectrum is high sensitivity to any possibility of something being abnormal; at the other, a focus on highlighting only those points on the scan that can be specifically diagnosed as a positive finding.
The computer assist "has to be as good as a human," says David Avrin, MD, an interventional radiologist and vice chair of informatics at University of California San Francisco Medical Center. "Because if it's not as good as a human, to be safe, you have to refer an incredible number of studies for human second interpretation."
"It's a never-ending problem of trying to increase sensitivity and specificity in medical imaging, whether human or machine interpreter," Avrin says. "You're always going to have a certain number of false positives and false negatives."
In mammography, for instance, if adjusting so as not to miss detecting disease, "then you may be calling too many patients back for biopsies and additional views, and causing them psychological harm. Now if I err on the side of, 'Gee, I really don't want to bug this lady about coming back,' then I'm at risk of missing the cancer."
With an automated detection system, "Where are you going to set the threshold point? If you set it too low, you're going to miss things; if you set it too high, every study the computer reviews has to be reviewed by a human," Avrin says. "Unless you have that device equal to the false-negative/false-positive discriminatory capability of a very good mammographer, then you can't put that device into place."
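The threshold tradeoff Avrin describes can be made concrete with a few lines of arithmetic. The scores and ground-truth labels below are invented, and no real detector is modeled; the point is only that moving one threshold trades sensitivity (catching every lesion) against specificity (not sending false alarms to a human reader).

```python
def sensitivity_specificity(scores, labels, threshold):
    """Flag a finding when its score meets the threshold, then compare
    against ground truth (True = real lesion)."""
    tp = fp = tn = fn = 0
    for score, is_lesion in zip(scores, labels):
        flagged = score >= threshold
        if flagged and is_lesion:
            tp += 1
        elif flagged and not is_lesion:
            fp += 1
        elif not flagged and is_lesion:
            fn += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sensitivity, specificity

# Hypothetical detector scores for ten regions, four of them true lesions.
scores = [0.95, 0.90, 0.80, 0.60, 0.55, 0.40, 0.35, 0.20, 0.10, 0.05]
labels = [True, True, False, True, False, False, True, False, False, False]

for t in (0.3, 0.5, 0.7):
    sens, spec = sensitivity_specificity(scores, labels, t)
    print(f"threshold={t}: sensitivity={sens:.2f}, specificity={spec:.2f}")
```

At the permissive threshold every lesion is caught but half the normal regions are flagged for human review; at the strict threshold the false alarms drop but lesions are missed, which is exactly the discriminatory bar Avrin says a deployed device must clear.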
Mount Sinai has CAD systems for mammography and chest CT scans, which are used "with mixed feelings," says Mendelson. They find true lesions, but in searching for any evidence of abnormalities, "they issue false positives-lesions that aren't real," he says. "The problem for radiologists is that [it] is time-consuming to filter out what's not real from what is real [and] which they might have missed.
"Make no mistake, I'm an advocate of these technologies, but I might be in the minority today," Mendelson adds. "There's a lot of human resistance to these things at the moment."
Mendelson himself doesn't see the time argument. "I work it into my workflow. But a lot of people have trouble doing that. And I'm not sure if their trouble is just change management or if it's really just hard for some people. I have a very efficient way; it adds a minute to my chest CT [reading]. So if I were reading 30 CTs a day, that would add 30 minutes. For the improvement in quality, I think that's worth the effort."
The strategy for enlisting computer-assisted image interpretation comes down to coping with its known shortcomings today while waiting for advances in accuracy and utility that many predict are on the way.
"Techniques improve," Mendelson says of the outlook for CAD. "There's a role for them today. There's probably going to be an increasing role. But we have to be patient and get there step by step: not have over-expectations, but not veto their use because they're not perfect."
McEnery says decision support to inform interpretation, supplied by the M.D. Anderson EMR, helps gain a context of the patient that is much richer than the summary, history and reason for the exam that radiologists usually have from the requisition. Besides providing additional clues about how to call toss-ups on the scan, it helps to "get a complete picture of the patient through the clinician's documentation."
That in turn enables radiologists to "create reports that are more pertinent to the patient's clinical presentation and the thinking of the [ordering] clinician." And with that extra insight into the case, such as a suspected lung cancer patient's smoking history and family or genomic factors, the report can include probabilities for what needs to be done next or, importantly for optimum image utilization, what doesn't need to be done, McEnery says.
M.D. Anderson also is in collaboration with IBM's Watson supercomputer to invent the next CAD model. Instead of mimicking the radiologist's process of segmenting an area for study and then deciding whether areas are normal or abnormal, Watson compares the medical records, negative and positive, of hundreds of thousands of patients and looks for patterns, then compares the computation with a patient's scan and makes recommendations based on that comparison, says McEnery.
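The pattern-comparison idea McEnery describes can be caricatured as a nearest-neighbor lookup: represent each prior patient as a feature vector with a known outcome, and surface the most similar prior cases for the current one. This generic sketch is not a description of how Watson works, and the features, cases and labels are all invented.

```python
import math

def most_similar(cases, query, k=3):
    """Rank prior cases by Euclidean distance to the query's features."""
    ranked = sorted(cases, key=lambda c: math.dist(c["features"], query))
    return ranked[:k]

# Hypothetical prior patients: (nodule size in mm, smoking pack-years),
# each with a known outcome.
cases = [
    {"features": (4.0, 0.0),   "label": "benign"},
    {"features": (5.0, 10.0),  "label": "benign"},
    {"features": (12.0, 30.0), "label": "malignant"},
    {"features": (15.0, 40.0), "label": "malignant"},
]

neighbors = most_similar(cases, query=(11.0, 25.0), k=2)
print([n["label"] for n in neighbors])  # prints ['malignant', 'malignant']
```

The recommendation then rests on the outcomes of the matched records rather than on segmenting and classifying the scan itself, which is the contrast with conventional CAD that McEnery draws.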
The concept of machine learning that resources such as Watson can do is only beginning to be understood. "The way a CAD is done today, we won't even recognize it in a few years once the computerized machine learning tools get applied to that," McEnery predicts.
Avrin is sure, however, that machines won't replace imaging experts.
In the way radiologists analyze "with our eyes and with our brain," he says, "there is a computational intricacy that we haven't figured out yet" with computers. "I would say that the general abstract problem of approaching a trained radiologist's accuracy for presence or absence of disease and for differential diagnosis . . . is one of the most challenging areas of computer science."
TECH HELPING WITH DIAGNOSIS, TREATMENT
In an environment that values well-managed financial risk and efficiency, computer-assisted detection systems can help prioritize and distribute images for reading, discern the effect of cancer treatment in weeks rather than months, and improve the use of hospital services.
Johns Hopkins Hospital gets the most out of a radiology staff by using a CAD system to first flag urgent or complex CT images of heart disease that should go quickly to its most experienced readers, then distribute the rest of the workload to others according to relative experience and speed of their work, says Roman Goldenberg, MD, a vice president of Rcadia, which developed the technology.
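The routing behavior described above can be sketched as simple worklist logic: CAD-flagged urgent studies go to the most experienced readers first, and the remainder are spread across the staff. The data model and round-robin rule here are invented for illustration and are not Rcadia's implementation.

```python
def route_studies(studies, readers):
    """studies: dicts with 'id' and 'cad_flag' ('urgent' or 'routine').
    readers: dicts with 'name' and 'experience' (higher = more senior).
    Returns a mapping of reader name -> assigned study ids."""
    by_experience = sorted(readers, key=lambda r: r["experience"], reverse=True)
    # Urgent studies sort to the front of the queue (False < True).
    queue = sorted(studies, key=lambda s: s["cad_flag"] != "urgent")
    assignments = {r["name"]: [] for r in readers}
    for i, study in enumerate(queue):
        # Senior readers pick up the head of the queue; round-robin after.
        reader = by_experience[i % len(by_experience)]
        assignments[reader["name"]].append(study["id"])
    return assignments

studies = [
    {"id": "ct-1", "cad_flag": "routine"},
    {"id": "ct-2", "cad_flag": "urgent"},
    {"id": "ct-3", "cad_flag": "routine"},
]
readers = [
    {"name": "senior", "experience": 10},
    {"name": "junior", "experience": 2},
]
print(route_studies(studies, readers))  # the urgent study lands on "senior"
```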
In cancer treatment, which often hinges on the earliest evidence possible that a chemotherapy treatment is working or not, a computerized tumor assessment technology goes beyond monitoring the size and shape of a tumor over time, differentiating cells killed by the chemo and cells that are still viable, says Boaz Pal, senior product manager of the Philips IntelliSpace Portal, which in its just-released eighth version features this technology.
"You don't have to wait months to see if the treatment worked," says Pal. The technology is sensitive enough for clinicians to see the impact within weeks, either to support continued productive therapy or stop therapy at the first sign of ineffectiveness. That reduces unnecessary risks attendant to chemotherapy and enables clinicians to quickly switch to something else that might work better, he says.
These are examples of CAD as the first read of imaging studies, rather than the backup second read after a radiologist already has had a long look.
The Rcadia system for automatically determining whether a CT angiogram shows more than 50 percent artery blockage has impact beyond triaging chest pain, says Goldenberg. By better analyzing the gray area of patient symptoms, it can expedite the transfer of patients to lower-cost care settings in the hospital if the presence of disease is not significant enough to require angioplasty. For those identified as needing an invasive procedure, the hospital has an extra measure of justification for the necessity of the procedure and getting reimbursed, he says.