The first time Kaiser Permanente Northwest launched an analytic tool to predict hospital readmission risk, the initiative flopped. The integrated health system wanted a tool that would identify high-risk patients before they were discharged so clinicians could arrange needed follow-up care to prevent complications.
“We had been using a subjective assessment or the physician’s gestalt on whether they thought the patient was at low, medium or high risk for readmitting,” said Delilah Moore, PhD, pharmacy analytics manager. “We were looking for a more objective measure.”
Kaiser rolled out the LACE model, a validated index that predicts 30-day readmission risk based on length of stay (L), acuity of admission (A), pre-existing comorbidities (C), and emergency department (E) visits. Patients’ LACE scores were shared via an Excel spreadsheet. But staff didn’t know what to do with the information, and physicians complained the score was inaccurate.
“We went back to the drawing board and said, ‘Let’s involve the right stakeholders,’” Moore said. After several rounds of analysis and refinement, Kaiser launched an improved LACE model in July 2014 that has gained physician confidence and is integrated into its electronic health record. Now, Kaiser Northwest is weighing whether the model is good enough or whether a new rendition is needed.
This continuous improvement approach reflects the experience of other healthcare organizations trying to use analytics to inform readmission efforts since Medicare started penalizing hospitals with higher-than-average rates for certain conditions. Progress is being made: Avoidable readmissions have declined in almost every state since 2010, but the size of the decline varies dramatically, from 0.7 percent in Oregon to 13.4 percent in Hawaii.
Hospital leaders working on this issue point to lessons they’ve learned about using analytics in readmission reduction efforts.
There is no shortage of readmission predictive models. A 2016 BMJ study identified 73 models in the literature, from commercial to homegrown indexes, including the Readmissions Risk Score, PARR-30, and HOSPITAL score. However, the predictive accuracy of models varied from poor to high (C-statistics 0.21 to 0.88), and models performed inconsistently across studies.
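For context on that accuracy range, the C-statistic measures a model's discrimination: the probability that it scores a randomly chosen readmitted patient higher than a randomly chosen patient who was not readmitted, where 0.5 is no better than chance. A minimal sketch of the standard pairwise calculation:

```python
from itertools import product

def c_statistic(scores_readmitted, scores_not_readmitted):
    """C-statistic (area under the ROC curve): the probability the
    model ranks a randomly chosen readmitted patient above a randomly
    chosen patient who was not readmitted; ties count as half."""
    pairs = list(product(scores_readmitted, scores_not_readmitted))
    wins = sum(1.0 if r > n else 0.5 if r == n else 0.0 for r, n in pairs)
    return wins / len(pairs)
```

A model that always ranks readmitted patients higher scores 1.0; a coin flip scores 0.5, which is why values near the low end of the reported range indicate a model performing no better than, or worse than, chance.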
To address validity issues, Kaiser Northwest closely involved physicians in customizing LACE for its Epic EHR and patient population, Moore said. Only one of the LACE variables (length of stay) was easily pulled from a discrete EHR field. For the other three variables, analytic staff had to pull data from various places and employ algorithms to calculate the LACE score of 0 to 19, with scores of 11 or higher indicating high readmission risk.
For example, to determine “C” or comorbidities, analytic staff had to look at the patient’s problem list and encounter diagnoses in the EHR. “We ran that through an algorithm which basically spit out a score per health record number, which was then fed back into Epic.”
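The variable-to-points mapping behind a LACE score can be sketched in code. The point assignments below follow the originally published LACE index (van Walraven et al., 2010); Kaiser's customized version differs mainly in how it derives the four inputs from the EHR, as described above, so treat this as an illustrative sketch rather than Kaiser's implementation.

```python
def lace_score(los_days, acute_admission, charlson_index, ed_visits_6mo):
    """Compute a LACE readmission-risk score (0 to 19) from the four
    index variables, using the originally published point assignments."""
    # L: length of stay in days (0 to 7 points)
    if los_days < 1:
        l = 0
    elif los_days <= 3:
        l = int(los_days)
    elif los_days <= 6:
        l = 4
    elif los_days <= 13:
        l = 5
    else:
        l = 7

    # A: acuity -- an emergent/urgent admission adds 3 points
    a = 3 if acute_admission else 0

    # C: comorbidity burden (Charlson index), capped at 5 points
    c = charlson_index if charlson_index <= 3 else 5

    # E: ED visits in the six months before admission, capped at 4
    e = min(ed_visits_6mo, 4)

    return l + a + c + e
```

For example, a patient with a five-day emergent stay, two comorbidity points and one recent ED visit scores 10, just under the high-risk threshold.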
Kaiser physicians then reviewed charts to see if a patient’s LACE score correlated with what they saw in the record. Physicians would note, for example, if a patient had type 2 diabetes and heart failure, but the LACE score missed the diabetes diagnosis.
“We went back and forth with the clinicians to make sure we were pointing to the right place in the record and that we were using the right diagnosis codes,” Moore said. “They would say, ‘Well, a more accurate way to get that [information] is to look here and here, not just this one place.’ ”
When El Camino Hospital, Mountain View, Calif., switched to an Epic EHR in November 2015, the 395-bed medical center also changed predictive readmission models. A staff physician with analytics experience developed El Camino’s first model in 2010, which was a great start, said CNO Cheryl Reinking, RN. But the tool was showing its age. For instance, some data elements had to be manually entered by nurses.
Instead of investing resources to develop a new index, El Camino adopted the LACE+ tool, which was built into the hospital's Epic EHR foundation system. LACE+ adds covariates to the LACE model, including age, case-mix index and number of urgent admissions. El Camino IT staff programmed the LACE+ score to appear in a visible spot in patient records and post-acute orders. But the organization did not customize the tool as Kaiser did.
From Reinking’s perspective, what matters most is that clinicians have a tool to help “scale our interventions,” based on a patient’s risk of complications and readmission. For instance, high-risk patients who transfer to skilled nursing facilities are visited by case managers who consult with facility staff on what’s best for the patient. Intensive interventions like this have helped reduce high-risk readmissions by around 30 percent.
“The cost of healthcare being what it is, we have to use our resources wisely,” she said. “Not every patient can get intensive follow up, and this kind of analytics helps us focus our resources.”
Reinking also believes clinicians’ gut feelings need to be considered. Many factors can impact readmission risks that are not included in LACE+, including medication adherence and social determinants of health. “When you are an experienced clinician, you know when someone is at risk. You can see the patient’s physiological and social problems.”
For this reason, El Camino clinicians can refer any patient, even those with low LACE+ scores, for intensive follow-up after discharge.
Access to the right mix of data is key to accurately predicting readmissions, according to Ashish Atreja, MD, chief technology innovation and engagement officer in medicine at the Icahn School of Medicine at Mount Sinai.
One common missing link is unstructured data. “Some people say up to 75 percent of meaningful health data is unstructured,” Atreja said. “For example, if someone lives alone or is having financial problems, we don’t have that in our structured data but it might be mentioned in the physician’s notes.”
Another missing link is real-time data on how patients are doing after they are discharged, which is also key to succeeding at population health management, Atreja said. “Of the 5,000 heart failure patients we saw last year, how many are at a critical level and need to be managed closely?”
A Mount Sinai pilot with 60 heart failure patients is designed to get past these two challenges.
An analytics tool from CloudMedx, equipped with natural language processing and machine learning capabilities, is pulling key unstructured and structured data from Mount Sinai's EHR into a predictive algorithm that continually recalculates each patient's readmission risk.
In addition, each of the 60 patients received digitally connected monitors and scales when they were discharged from the hospital. They have been asked to regularly report key symptoms, such as shortness of breath, as well as document their medication compliance. All of this additional data—blood pressure readings, weight changes, and patient-reported outcomes—is continually incorporated into the predictive model.
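The idea of re-scoring risk as each home reading or patient report arrives can be illustrated with a simple logistic model. The feature names and weights below are hypothetical stand-ins; a real model like Mount Sinai's would be trained on the health system's own data.

```python
import math

# Hypothetical feature weights for illustration only; a production
# model would learn these from the health system's own patient data.
WEIGHTS = {
    "bias": -2.0,
    "weight_gain_kg_week": 0.6,    # rapid weight gain can signal fluid retention
    "shortness_of_breath": 1.2,    # patient-reported symptom (0 or 1)
    "missed_doses_week": 0.4,      # medication compliance signal
    "lives_alone": 0.8,            # socioeconomic factor from notes
}

def readmission_risk(features):
    """Recompute a 30-day readmission probability from the latest
    features using a logistic (sigmoid) model. Called every time a
    new monitor reading or patient report arrives."""
    z = WEIGHTS["bias"] + sum(
        WEIGHTS[name] * value for name, value in features.items()
    )
    return 1 / (1 + math.exp(-z))
```

Each new data point simply changes the feature values, so the patient's risk estimate stays current between clinic visits, which is the real-time property the pilot is testing.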
“We want to see if using real-time data like blood pressure readings, as well as socioeconomic data, can help us better predict patients who are going to be readmitted,” Atreja said.
Many of the predictive readmission models that hospitals currently use, such as LACE, are rule-based models built on a small set of data variables rather than machine learning models that can rapidly analyze a massive variety and volume of data, according to Ankur M. Teredesai, executive director of the Center for Data Science at the University of Washington Tacoma.
Teredesai and his UW Tacoma team built one of the first machine learning predictive models for readmissions in collaboration with MultiCare Health System. The first rendition of the model—called the Risk-O-Meter—focused on heart failure patients, calculating their readmission risk from a diverse set of information pulled from the EHR, claims data, administrative and financial systems, as well as patient-reported data.
Three years later, the readmissions index is just one predictive model in the KenSci Risk Management Platform for Healthcare, which includes more than 120 machine learning models that forecast the likelihood of numerous health risks, from death and disease progression to developing a chronic disease or suffering an adverse event after surgery.
The different machine learning models help inform each other, Teredesai explains. “We combine the output of all these models and decide what the patient’s risk of readmission is likely to be. We take the output of the mortality models and the length of stay predictors. We look at the disease progression models. We also take into account psychosocial factors, like if the patient has access to transportation or a caregiver at home.”
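Combining the outputs of upstream models into one readmission estimate can be sketched as a weighted blend. The article does not describe how the platform actually combines its models, so the simple weighted average below is only a stand-in for that idea, with invented model names and weights.

```python
def combined_readmission_risk(model_outputs, weights):
    """Blend the risk scores of several upstream models (e.g. mortality,
    length of stay, disease progression, psychosocial factors) into a
    single readmission-risk estimate via a normalized weighted average."""
    total = sum(weights[name] for name in model_outputs)
    return sum(weights[name] * score for name, score in model_outputs.items()) / total

# Hypothetical upstream scores for one patient:
patient = {
    "mortality": 0.2,
    "length_of_stay": 0.5,
    "disease_progression": 0.3,
    "psychosocial": 0.4,
}
equal_weights = {name: 1.0 for name in patient}
risk = combined_readmission_risk(patient, equal_weights)
```

In practice the weights themselves would be learned, and a stacked model could replace the average entirely; the point is that each specialized model contributes evidence to the final readmission score.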
The platform is built on the Microsoft Azure cloud, which is HIPAA compliant. A cloud-based analytics system is ideal for large-scale, rapid-fire data analytics, said Sunny Neogi, KenSci's head of marketing and growth.
In addition to giving patients a readmission risk score, the KenSci 30-day readmissions tool recommends specific interventions that clinicians can implement to reduce readmissions for each patient. The recommendations vary from patient to patient, depending on their risk factors.
For instance, the U.S. Army Medical Corps is using the KenSci tool to reduce readmissions among patients with heart conditions. When the tool identifies a patient who lives alone, it recommends that a tele-nurse be assigned to call the patient after discharge with reminders to take medications and eat well.
While commercial machine learning models can be expensive, hospitals typically see a return on investment by preventing approximately 50 readmissions, Neogi said. For instance, the U.S. Army Medical Corps is on track to save tens of millions of dollars.
Retrospectively looking into why readmissions occur can also help prevent future returns to the hospital. At El Camino, an integrated care team meets every Friday and reviews all Medicare readmissions, explains Reinking. “We look at, ‘Was this patient's readmission avoidable? If it was avoidable, what was the reason? Was it a process failure? Was it a socioeconomic issue that we didn't see or should have addressed with more resources?’ ”
As a result of these Friday sessions, El Camino found that many avoidable readmissions are related to end-of-life decision making. “We have a multicultural patient population, and end-of-life issues are addressed quite differently in different cultures. We're trying to address this. For instance, we've created a Chinese ambassador end-of-life program for our Asian population.”
Ultimately, Teredesai believes that predicting and reducing readmissions needs to be done in concert with other important goals. “Readmission is a very important secondary metric, but analytics need to be used to help providers solve more basic problems, like predicting adverse events after surgery, reducing mortality rates or predicting disease progressions of patients,” he said.
Consider, for example, a heart failure patient who is admitted to the hospital. “The nurses and care managers are very focused on making sure this patient doesn’t readmit,” Teredesai said. “But they care more about making sure the patient doesn’t die.”