NLP to help Mercy better treat heart failure cases

Study will look at how cardiac resynchronization therapy affects patient care and recovery.


Mercy is using natural language processing technology to extract key cardiology measures from electronic health records, clinician notes and other sources, then using that data to help providers make better treatment decisions for patients with heart failure.

NLP technology can take transcribed text, structure it into computable data and apply terminologies or codes for richer data abstraction and analysis. NLP text mining extracts information from text-based clinician notes and converts it into actionable insights that can be placed into a dataset and analyzed.
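The article does not describe Mercy's actual tooling, but the kind of extraction described above can be sketched in a few lines of Python. The note text, the pattern and the function below are illustrative assumptions only; production clinical NLP relies on much richer terminology mapping and context handling.

```python
import re
from typing import Optional

# Hypothetical note text for illustration; not drawn from Mercy's records.
note = "Echo today. LV ejection fraction estimated at 30-35%. NYHA class III symptoms."

# A minimal, illustrative pattern for pulling an ejection fraction value out of
# free text. Real pipelines also apply standard terminologies and codes.
EF_PATTERN = re.compile(
    r"ejection fraction[^0-9]{0,20}(\d{1,2})(?:\s*-\s*(\d{1,2}))?\s*%",
    re.IGNORECASE,
)

def extract_ejection_fraction(text: str) -> Optional[float]:
    """Return an ejection fraction percentage found in a note, if any."""
    match = EF_PATTERN.search(text)
    if not match:
        return None
    low = float(match.group(1))
    high = float(match.group(2)) if match.group(2) else low
    return (low + high) / 2  # midpoint of a reported range

print(extract_ejection_fraction(note))  # 32.5
```

Values extracted this way become structured, computable fields that can sit alongside coded data in an analytic dataset.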

Mercy started to aggregate heart failure data in 2011 to identify patients with a cardiac resynchronization therapy device, known as CRT, which is a type of defibrillator. By 2017, the organization had 35.5 million clinical notes from inpatient and outpatient encounters that had been extracted, processed and loaded into an NLP server.



Mercy also entered into a partnership with CRT vendor Medtronic, which, like other device makers, wants to know how its devices are performing. Medtronic is using EHR data to explore real-world factors that determine a patient's response to cardiac resynchronization therapy.


Now, having this data will help Mercy answer a key clinical question, how cardiac resynchronization therapy affects patients who have heart failure, says Kerry Bommarito, manager of data science and performance analytics for Mercy.

Data being used to support analytics includes patient demographics; medications prescribed in the hospital and after discharge at home; lab tests; cardiac echo results; ejection fraction (the percentage of blood the heart pumps out with each contraction); procedures; CPT codes; and clinical notes.
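A record layout for those data elements might look something like the sketch below. The field names, types and example values are assumptions for illustration, not Mercy's actual schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

# Illustrative record covering the data elements listed in the article.
@dataclass
class HeartFailureRecord:
    patient_id: str
    age: int
    sex: str
    inpatient_medications: List[str] = field(default_factory=list)
    discharge_medications: List[str] = field(default_factory=list)
    lab_results: Dict[str, float] = field(default_factory=dict)   # test name -> value
    echo_results: Dict[str, float] = field(default_factory=dict)  # measure -> value
    ejection_fraction: Optional[float] = None                     # percent
    procedures: List[str] = field(default_factory=list)
    cpt_codes: List[str] = field(default_factory=list)
    note_ids: List[str] = field(default_factory=list)             # links to clinical notes

record = HeartFailureRecord(
    patient_id="example-001", age=68, sex="F",
    ejection_fraction=32.5, cpt_codes=["example-code"],
)
```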

Classifying the CRT data and analyzing it will help clinicians better understand how well a patient is able to handle everyday activities around the home following treatment, with the hope that health status measures improve, says Mark Dunham, director of data engineering and analytics.

Preparation work for the program was extensive. It began with building queries to run over the clinical notes, then validating the queries' predictive values and checking for false negatives, Bommarito recalls.
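That validation step can be illustrated with a short sketch: comparing query hits against a manually reviewed reference sample and computing positive predictive value and sensitivity. The counts below are made up for illustration and say nothing about Mercy's actual results.

```python
# Minimal sketch of query validation against a hand-reviewed sample.
def query_validation(true_pos: int, false_pos: int, false_neg: int) -> dict:
    """Positive predictive value and sensitivity for a note query."""
    ppv = true_pos / (true_pos + false_pos)
    sensitivity = true_pos / (true_pos + false_neg)
    return {"ppv": ppv, "sensitivity": sensitivity}

print(query_validation(true_pos=180, false_pos=20, false_neg=10))
# {'ppv': 0.9, 'sensitivity': 0.947...}
```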

Now, with more experience, query creation is deeper and more precise, and delivers more insights. But the big lesson learned was just how much data an organization will have to sort through and select, she advises. “We didn’t know how many clinician notes we had and how large that volume was; we had 30 million notes just for heart failure patients.”
