Health systems are using predictive modeling to zero in on patients who may be most at risk for readmission or serious diseases. While not 100 percent accurate, these analytical tools are precise enough to help identify patients who most need additional care management. 

Social workers at Minneapolis-based Allina Health used to spend hours each day combing through medical records and other documents, trying to identify patients at risk for readmission. By arranging needed post-discharge services, the social workers hoped to reduce patients’ unnecessary returns to the hospital.

Now, thanks to a predictive algorithm, these social workers have a list of high-risk patients at their fingertips. Every inpatient at Allina’s 11 hospitals is assigned a readmission risk score, which predicts the likelihood (high, moderate-high, moderate, or low) that the patient will be readmitted within 30 days.
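To make the tiering concrete, here is a minimal sketch of how a predicted 30-day readmission probability could be sorted into the four tiers described above; the numeric cutoffs are hypothetical, since Allina’s actual thresholds are not published in this article.

```python
def readmission_tier(probability: float) -> str:
    """Map a predicted 30-day readmission probability to a risk tier.

    The cutoffs below are illustrative assumptions, not Allina's thresholds.
    """
    if probability >= 0.30:
        return "high"
    if probability >= 0.20:
        return "moderate-high"
    if probability >= 0.10:
        return "moderate"
    return "low"

print(readmission_tier(0.27))  # -> "moderate-high"
```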

Until recently, few healthcare providers had enough data to statistically predict a patient’s risk of readmission or likelihood of carrying serious diseases, such as methicillin-resistant Staphylococcus aureus. Now more and more healthcare organizations are compiling massive amounts of patient data via electronic health records (EHRs). Combined with predictive analytic techniques, this trove of data can yield insights that can be harnessed to improve patient care and reduce costs.


Modeling for Value

In Moneyball, Billy Beane (played by Brad Pitt in the movie adaptation) relies on a genius-minded statistician (a fictional character based on Paul DePodesta) to help pull together a winning baseball team with one of the stingiest payrolls in the major leagues. Beane embraced sabermetrics, a type of predictive analytics, to identify promising players who were undervalued in the Major Leagues. The rock-bottom Oakland Athletics ended up finishing first in the American League West.

Today, many healthcare organizations are in a similar spot as the Athletics in 2002. They are trying to dramatically improve performance at the same time their reimbursements are shrinking. Predictive analytic techniques, which range from basic statistical regression to sophisticated machine learning, can help healthcare organizations pinpoint how they can get the most bang for their buck in terms of improving quality and reducing costs. 

“We are trying to understand as early as possible how to get the right care to the right patient at the right time,” says Allina’s Michael Doyle, director, health care intelligence.

Predictive analytics have been used for decades by health insurers to identify potential high users of health services, as well as by the finance industry to calculate consumer credit scores. The growing interest in this type of business intelligence among healthcare providers correlates with the growing amount of financial risk they are assuming under health reform. With Medicare payments now tied in part to hospital readmissions and other performance-based metrics, providers are motivated to invest in reducing needless admissions and visits and costly adverse events. 

Reducing Unnecessary Utilization 

As one of the Pioneer Accountable Care Organizations, Allina Health is expected to provide high-quality care to Medicare patients for a fixed cost. The health system, which prides itself on its mission-based quality focus, has also assumed a financial imperative to prevent unnecessary hospitalizations and emergency department (ED) visits. 

Assigning inpatients a readmission risk score. To develop a readmissions predictive model, Allina’s senior statistician Jason Haupt, PhD, used data on about 200,000 inpatients, stored in the health system’s enterprise data warehouse. Using various modeling techniques, Haupt investigated the predictive value of hundreds of variables that might influence a patient’s readmission risk. The resulting algorithm employs 30 highly predictive variables, including patient utilization, medical history (e.g., history of diabetes), and various clinical data.

While not perfect, the Allina model has moderate discrimination ability (C-statistic: 0.73), and it compares favorably with readmission predictive models that are based primarily on insurance claims data. Because of its EHR, Allina can incorporate various types of clinical data collected on patients.
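For readers unfamiliar with the C-statistic: for a binary outcome such as 30-day readmission, it equals the area under the ROC curve, i.e., the probability that a randomly chosen readmitted patient is scored higher than a randomly chosen patient who is not readmitted. The sketch below, which uses synthetic data and hypothetical feature names rather than Allina’s actual variables, shows how such a model might be fit and its C-statistic computed.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Hypothetical features echoing the kinds of variables the article names.
X = pd.DataFrame({
    "prior_admissions_12mo": rng.poisson(1.0, n),   # utilization
    "has_diabetes": rng.integers(0, 2, n),          # medical history
    "blood_urea_nitrogen": rng.normal(16, 6, n),    # lab value
    "lives_alone": rng.integers(0, 2, n),           # social factor from nursing assessment
})
# Synthetic 30-day readmission outcome, loosely tied to the features.
logit = (-2.2 + 0.5 * X["prior_admissions_12mo"] + 0.4 * X["has_diabetes"]
         + 0.03 * (X["blood_urea_nitrogen"] - 16) + 0.3 * X["lives_alone"])
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# For a binary outcome, the C-statistic is the ROC AUC.
c_stat = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"C-statistic: {c_stat:.2f}")
```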

“We can include, for example, a person’s weight, functional status (which our nurses measure), and medications,” says Haupt. “We found lab values, such as blood urea nitrogen, to have a lot of value all by themselves in predicting readmissions, and most readmission predictive models don’t include these values. We also found nursing assessments to be a very important data source, particularly related to social factors. The nursing assessment includes information on patients’ financial concerns, mobility concerns, and other things that are not captured in the claims data.”

Allina’s readmission algorithm sorts through EHR data on a daily basis, assigning readmission scores to all hospitalized patients. To make this information useful, it is uploaded onto the health system’s patient census dashboard, providing Allina staff with a one-stop place to obtain a list of inpatients with a high risk of readmission (see the exhibit below). “This dashboard is available to users across our organization as a business intelligence tool,” says Doyle.

[Exhibit: Allina’s patient census dashboard, which displays readmission risk scores]
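As a rough sketch of what this daily scoring-and-publishing step might look like, the snippet below pulls the current census from a warehouse table, scores each inpatient, and writes the results to a table a dashboard could read. The schema, table names, and the model and tier function (assumed to come from the earlier sketches) are illustrative assumptions, not Allina’s implementation.

```python
import sqlite3
import pandas as pd

def score_daily_census(conn: sqlite3.Connection, model, feature_cols):
    """Score every current inpatient and publish risk tiers to a dashboard table."""
    # Hypothetical schema: one row per current inpatient, with the model's features.
    census = pd.read_sql_query(
        "SELECT patient_id, " + ", ".join(feature_cols) +
        " FROM inpatient_census WHERE discharge_date IS NULL",
        conn,
    )
    census["readmission_probability"] = model.predict_proba(census[feature_cols])[:, 1]
    # readmission_tier() is the tier-mapping sketch shown earlier in this article.
    census["risk_tier"] = census["readmission_probability"].apply(readmission_tier)
    # Publish to the (hypothetical) table the patient census dashboard reads from.
    census.to_sql("dashboard_readmission_scores", conn, if_exists="replace", index=False)
    return census
```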

Informing primary care physicians. Like Allina, NorthShore University HealthSystem, Evanston, Ill., developed a readmissions predictive model based on a variety of clinical, utilization, and social variables. Hospitalists receive daily reports that divide inpatients into high (roughly 35 percent readmission rate), medium (roughly 16 percent), and low (roughly 12 percent) risk of readmission within 30 days.

“The hospitalists, social workers, and other staff found the reports so useful for discharge planning purposes that they asked if we could have the readmission score come up as a value in every patient record in the inpatient EHR,” says Jonathan Silverstein, MD, MS, vice president of clinical research informatics at NorthShore. “This was not trivial and involved pulling data from our enterprise data warehouse to a robot that is doing calculations and passing them back to the EHR. But now this is totally automated.”

This success quickly led to another request. “Our medical group got excited and asked if we could send a report to our primary care physicians that listed their patients with a high risk of being readmitted. Now we continue to keep patients on the readmissions report for 30 days after discharge, and we provide that report to the patients’ primary care physicians.”

NorthShore physicians, social workers, and other staff use these reports to help inform and coordinate care during and after the hospital transition. “What’s exciting is that the interventions we have implemented, such as making sure patients see their primary care physicians after they leave the hospital, seem to be improving our readmission profile to a measurable degree. We have seen a reduction from 35 percent to 28 percent in readmissions among patients who have a high risk of readmission.”

Guiding improvement efforts. Allina is currently testing about 10 interventions for reducing readmissions. Some are focused on particular patient populations, such as an executive function screening for patients with impaired cognition. Other approaches are aimed at patients with varying degrees of risk for readmissions. “Very complex patients tend to require lots of resources, but we may be able to bend the curve for moderately complex patients without applying as many resources,” says Haupt. 

For instance, a few hundred patients with complex conditions have been assigned transition coaches who teach the patients how to self-manage their conditions. The coaching program—which is modeled after The Care Transitions Program℠ developed by Eric Coleman, MD—has reduced readmissions by about 30 percent among Allina participants.

In addition, Allina now holds a transitions conference for hospital patients who have a high risk for readmission (i.e., a readmission score of 20 percent or greater). The conference brings together all the caregivers and family members involved in the patient’s post-discharge care. “It’s really about making sure that the transition to home or the skilled nursing facility is robust and that patients have a scaffolding in place so they don’t fall into that readmission category,” says Haupt. 

Henry Ford Health System is using predictive analytics to zero in on congestive heart failure patients who might benefit from a combination of intensive case management and telemonitoring. “We use this clinical intelligence as a strategy for population management, to prioritize which patients we should outreach to first, second, and third,” says Cara Seguin, RN, MSN, director, Center of Clinical Care Design. 

Access related sidebar: Reducing CHF Hospitalizations at Henry Ford

Predicting Disease Risk 

Known for its leading-edge IT, NorthShore has a robust enterprise data warehouse, and its inpatient and outpatient EHRs have both earned HIMSS’ highest ranking. In recent years, the health system—which is the primary teaching affiliate for the University of Chicago’s Pritzker School of Medicine—formed the Center for Clinical and Research Informatics. Focused on clinical quality improvement and peer-reviewed research, the Center has about 50 projects going on at any one time. 

Hypertension. One of Silverstein’s favorite predictive modeling projects identifies patients who have hypertension—but don’t know they have it. The project was started by Michael Rakotz, MD, a family medicine physician at one of NorthShore’s owned practices, who was bothered by epidemiological research showing that approximately 25 percent of people with hypertension have not been diagnosed with the disease. “Many of the patients get regular medical care, but their hypertension has been overlooked,” explains Silverstein. “Perhaps, for example, their blood pressure was a little bit elevated once, but they blamed it on having just run up the stairs.” 

Rakotz wanted to know if patients in the NorthShore system had similar rates of undiagnosed hypertension. So Silverstein’s team developed a predictive algorithm for undiagnosed hypertension, using data on about 1 million outpatients in NorthShore’s enterprise data warehouse. The algorithm includes predictive variables based on the clinical literature (for example, three blood pressure readings over a certain threshold in the past year), as well as statistical components based on machine learning, multiple regression, and other modeling techniques, says Silverstein.
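A simple sketch of how such a hybrid rule-plus-model flag might be expressed is shown below; the column names, the three-readings rule, and the 0.5 score cutoff are assumptions for illustration, not NorthShore’s published criteria.

```python
import pandas as pd

def flag_possible_hypertension(patients: pd.DataFrame, model_scores: pd.Series) -> pd.Series:
    """Flag patients who may have undiagnosed hypertension.

    Combines a literature-style rule (repeated elevated blood pressure readings
    with no hypertension diagnosis on file) with a statistical model's score.
    All column names and cutoffs are illustrative assumptions.
    """
    rule_based = (
        (patients["elevated_bp_readings_past_year"] >= 3)
        & ~patients["has_hypertension_diagnosis"].astype(bool)
    )
    model_based = model_scores >= 0.5
    return rule_based | model_based
```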

Using the algorithm, Silverstein’s team was able to identify 1,586 outpatients who might have hypertension, but had not yet been diagnosed with it. “Through a lot of shoe leather, Mike went to the primary care physicians of all these patients and said, ‘Please bring these patients in and evaluate them.’” Using a sophisticated blood pressure machine to get a highly accurate reading, the physicians discovered that 188 of the 496 patients who returned immediately for testing—about 38 percent—did indeed have hypertension.

[Exhibit: Positive predictive value of NorthShore’s hypertension algorithm]

Silverstein’s team further refined the hypertension algorithm to have a 50 percent positive predictive value and built it into the outpatient EHR so that physicians are alerted whenever a patient meets the criteria for potentially undiagnosed hypertension. “That’s been running now for more than six months and, the last time I checked, it was going off on about 100 patients per month and about 50 percent of those were diagnosed with hypertension.”
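The positive predictive value in both cases is simply confirmed diagnoses divided by patients flagged, as this short calculation using the figures reported above illustrates.

```python
def positive_predictive_value(confirmed: int, flagged: int) -> float:
    """Share of flagged patients who turn out to have the condition."""
    return confirmed / flagged

# 188 of the 496 flagged patients who returned for testing were confirmed
# hypertensive; the refined in-EHR alert confirms roughly 50 of every
# 100 patients it fires on.
print(f"{positive_predictive_value(188, 496):.0%}")  # -> 38%
print(f"{positive_predictive_value(50, 100):.0%}")   # -> 50%
```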

MRSA. Another predictive model, this one for methicillin-resistant Staphylococcus aureus (MRSA), is saving NorthShore about $500,000 a year. Before the predictive model was developed, NorthShore tested all newly admitted hospital patients to determine whether they were carriers of the highly infectious organism, which is impossible to spot in asymptomatic patients.

“NorthShore had previously shown that if you test every patient for MRSA through a nasal swab, and you isolate those carriers for MRSA, then you can reduce the inpatient spread of MRSA and reduce very serious complications from these superbugs,” says Silverstein. “So we were testing a lot of people. The reason most organizations don’t test all inpatients is because of the cost. It was costing us more than $1 million a year, and much of that testing was unreimbursed because it was done on a hospital epidemiology basis.”

Thanks to the predictive model developed by NorthShore researcher Ari Robicsek, MD, and colleagues, NorthShore continues to keep MRSA under control—but is only swabbing half of newly admitted patients. Robicsek’s model zeros in on those patients who are potential carriers of MRSA based on variables such as whether the patient lives in a nursing home, has a feeding tube at admission, or has lung disease.


Recognizing that not all hospitals have access to the sophisticated data that NorthShore has, Robicsek developed different versions of the MRSA predictive model. The most sophisticated model involves 27 variables, but simpler models require fewer variables (Robicsek, A., et al., “Electronic Prediction Rules for Methicillin-Resistant Staphylococcus aureus Colonization,” Infection Control and Hospital Epidemiology, vol. 32, no. 1, January 2011, pp. 9-19).
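As a rough illustration of the trade-off behind these tiered models, the sketch below fits the same kind of model on a full feature set and on a reduced one and compares their discrimination. The variables and synthetic data are placeholders, not the published 27-variable model.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 4000
# Synthetic stand-ins for admission variables like those named in the article.
data = pd.DataFrame({
    "nursing_home_resident": rng.integers(0, 2, n),
    "feeding_tube_at_admission": rng.integers(0, 2, n),
    "chronic_lung_disease": rng.integers(0, 2, n),
    "prior_mrsa_history": rng.integers(0, 2, n),  # stand-in for data only richer EHRs capture
})
logit = (-3.0 + 1.2 * data["prior_mrsa_history"] + 0.8 * data["nursing_home_resident"]
         + 0.6 * data["feeding_tube_at_admission"] + 0.4 * data["chronic_lung_disease"])
data["mrsa_carrier"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

full_features = ["nursing_home_resident", "feeding_tube_at_admission",
                 "chronic_lung_disease", "prior_mrsa_history"]
light_features = ["nursing_home_resident", "chronic_lung_disease"]

train, test = train_test_split(data, random_state=0)
for label, cols in [("full model", full_features), ("lighter model", light_features)]:
    fit = LogisticRegression(max_iter=1000).fit(train[cols], train["mrsa_carrier"])
    auc = roc_auc_score(test["mrsa_carrier"], fit.predict_proba(test[cols])[:, 1])
    print(f"{label}: C-statistic {auc:.2f}")
```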

“NorthShore can use a very robust model, but we have data that others don’t have,” says Silverstein. “So we put together a lighter model requiring fewer data points so that it can be generalizable and brought to other healthcare institutions.”

Putting Together the Pieces

“Predictive modeling is about asking the right question, which is the hardest part,” says Silverstein. “The right question is one that is impactful and that has the potential to change clinical practice. Then once we have the question, we need to provide an answer that can be applied across the health system.”

Accomplishing these objectives boils down to three things, says Doyle: “You need the right people to think about predictive modeling and help develop the data infrastructure. You need the right processes to help disseminate what can be done with the data in the operational community, and you need the right technology to help make that happen in a way so people are willing to adopt it and make use of it.”

Aligning the expertise. Silverstein has assembled a critical mix of clinical and technical expertise at NorthShore’s Center for Clinical and Research Informatics. The center has seven faculty members who oversee specific areas of research, including neurology and medical genetics. “Each of these faculty has an orientation toward a clinical domain as well as informatics expertise,” says Silverstein. “The rest of the center staff are technical people. We have a couple of open-source research programmers who build algorithms. We have about four statisticians who also handle programming. And we have 12 people who are embedded in health IT—seven are in EHR optimization and five are split between the data warehouse team and clinical analytics, or data reporting.”

With one of the nation’s fastest-growing hospital-based research programs, NorthShore requires more bandwidth than the typical health system looking to invest in predictive analytics and similar business intelligence.

Allina has put together a team similar to NorthShore’s, but on a smaller scale. The addition of Haupt’s statistician position has been key to carrying out predictive modeling at the Minneapolis-based system. Before coming to Allina, Haupt crunched and analyzed data at the CERN laboratory in Switzerland, the world’s largest particle physics lab. “I know health care has a lot of data, but it’s nothing compared to what I’ve seen. Some of the data sets I used involved 20,000 to 30,000 computers working all night long.”

With a PhD in experimental high-energy particle physics, Haupt admits his qualifications exceed what is needed at healthcare organizations that want to pursue predictive analytics. “There are not too many people like me in health care,” says Haupt. “But there are plenty of people who have the necessary statistical knowledge and background. One of the things that is most useful is having the experience of working with a lot of data.” 

Ensuring data standards and storage. Both NorthShore and Allina have advanced EHRs and enterprise data warehouses, which means they already have access to a lot of data to use for predictive modeling and other types of business intelligence. “Many of the data fields in our medical records have been in existence for quite some time, and the data are in a centralized location where I can just query the database, get the variables, and run them through statistical software,” says Haupt. “So there are a lot of low-hanging fruit opportunities. You can determine who is more likely to be readmitted just from the data that already exist in the medical record.”
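A toy version of this query-then-analyze workflow is sketched below, using an in-memory SQLite database as a stand-in for the enterprise data warehouse; the schema and values are invented for illustration.

```python
import sqlite3
import pandas as pd

# Stand-in for the enterprise data warehouse (hypothetical schema and values).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE inpatient_encounters (
        patient_id INTEGER, age INTEGER, prior_admissions_12mo INTEGER,
        has_diabetes INTEGER, readmitted_within_30_days INTEGER
    );
    INSERT INTO inpatient_encounters VALUES
        (1, 67, 2, 1, 1), (2, 54, 0, 0, 0), (3, 78, 3, 1, 1), (4, 61, 1, 0, 0);
""")

# "Query the database, get the variables" ...
df = pd.read_sql_query(
    "SELECT age, prior_admissions_12mo, has_diabetes, readmitted_within_30_days "
    "FROM inpatient_encounters",
    conn,
)

# ... "and run them through statistical software": a first look at how
# readmission rates vary with prior utilization.
print(df.groupby("prior_admissions_12mo")["readmitted_within_30_days"].mean())
```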

This offers two lessons. The first is the need for adequate data storage. “Due to collaboration in health care and the emergence of vendors in this area, it seems to be taking a lot less time to implement an enterprise data warehouse,” says Doyle. “It’s taking under a year now, as compared to two years just a few years ago.”

The second lesson: During the implementation of EHRs, organizations need to promote the capture of structured, clean data. “People miss the fact that the cleanliness of the data is paramount to modeling,” says Silverstein. “They start using data, such as billing data, that are built for completely different purposes than what their predictive model is focusing on. Remember, there’s a whole translation that occurs with the data. You have to figure out what the data really mean. What is the night watchman really doing? The night watchman may be a clinician, a respiratory therapist, or an administrative person. And you have to ask, ‘What is the process for entering these data? What are the data there for? How can I use the data in an effective way and not misuse the data?’”

Obtaining clean data often involves redesigning workflows to ensure that staff are electronically documenting key patient data, using point-and-click capabilities. “As you are implementing your EHR, you need to capture things as discretely as possible within the medical record,” says Doyle. 

Putting intelligence to work. The true measure of a successful predictive model is whether it provides the intelligence needed to improve quality and reduce costs. “Predictive modeling is useful when tied in a complete loop with the action arm of the organization,” says Silverstein. 

A user-friendly interface for reporting the predictive information can be helpful. Before creating its patient census dashboard, which displays readmission scores, Allina sought input from the potential users, including social workers and case managers. “We went through several iterations of the dashboard based on their feedback, and this was key in making the dashboard efficient for their workflow,” says Haupt. 


Allina also tapped a system-level champion: Karen Tomes, RN, director of quality improvement and care management. “One of the keys to success has been Karen’s involvement,” says Doyle. “She convened groups of staff to demonstrate the potential value of the tool, she spearheaded the development of educational materials, and she helped develop a workflow that uses the predictive model to help focus care management resources.”

Beginning with Good Enough

Perfectionists may have qualms about using predictive modeling to foretell readmissions or other health events. In fact, 2011 research by the U.S. Department of Veterans Affairs (VA) concluded that “most current readmission risk prediction models, whether designed for comparative or clinical purposes, perform poorly” (Kansagara, D., et al., “Risk Prediction Models for Hospital Readmission: A Systematic Review,” JAMA, vol. 306, no. 15, October 2011, pp. 1688-1698).

However, the VA research primarily criticizes the use of these predictive models in public reporting efforts that compare hospital readmission rates—and, in Medicare’s case, financially penalize poorer performing hospitals. The VA researchers leave the door open to hospitals that want to use readmission risk scores to guide the allocation of resources for patients. In fact, a 1997 study found that even screening tools with a low positive predictive value (i.e., 20 percent to 30 percent) could make a case management program cost-effective (Mukamel, D.B., et al., “Effect of Accurate Patient Screening on the Cost-Effectiveness of Case Management Programs,” Gerontologist, vol. 37, no. 6, December 1997, pp. 777-84).

Both NorthShore’s and Allina’s readmission models have moderate predictive abilities, which Silverstein says is “good enough” for the purpose of helping pinpoint which patients are most in need of clinical interventions.

This same anti-perfectionist approach can help resource-thin organizations that want to reap the rewards of predictive modeling and other analytics. “You don’t necessarily have to build a robust EHR system that covers the entire landscape of electronic data,” says Silverstein. “Choose a few questions that are important to your organization. Then start collecting data around those questions. Laterally, you can choose a less comprehensive approach to data collection. But vertically, you have to manage the data comprehensively or you have nothing. So make sure the data are collected in a consistent manner, make sure you can analyze the data, and make sure you do something actionable with the data.”


Maggie Van Dyke is managing editor of Leadership (mvandyke@hfma.org). 

Interviewed for this article (in order of appearance): Michael Doyle is director, health care intelligence, Allina Health, Minneapolis (michael.doyle@allina.com). Jason Haupt, PhD, is senior statistician, Allina Health (jason.haupt@allina.com). Jonathan Silverstein, MD, MS, is vice president of clinical research informatics at NorthShore University HealthSystem, Evanston, Ill. (jsilverstein@northshore.org). Cara Seguin, RN, MSN, is director, Center of Clinical Care Design, Henry Ford Health System, Detroit (cseguin1@hfhs.org).

Publication Date: Friday, March 01, 2013