Like many hospitals, Advocate was struggling to reduce 30-day hospital readmissions, a key benchmark for Medicare reimbursement. An earlier risk-stratification model developed in-house at Advocate had shown little predictive power, a result typical of attempts to classify patients using administrative data alone. The Cerner model automated the process, identifying patients deemed at high risk of readmission and embedding that information within the EHR. The first iteration yielded only modest predictive value. But after only a year of use, readmissions from all causes had dropped by 20 percent among the highest-risk patients in the Advocate system, comparing outcomes in the first half of 2013 with the same period in 2014. Bharat Sutariya, M.D., chief medical officer and vice president of population health for Cerner, notes that Northern Arizona Healthcare, a Cerner client hospital, has reduced its readmission rate by more than 40 percent since implementing the model in mid-2014.
Developers acknowledge that the model is only as good as the data available to it, and Cerner’s model is still missing some relevant data, such as socioeconomic factors, which may limit its predictive ability.
Further, Sutariya says that being able to identify a person as high-risk doesn’t always mean that you can mitigate that risk.
The team is working on stratifying groups of patients into clusters and targeting those for whom intervention is possible and may make a difference in outcome. For instance, the team is implementing a predictive tool to identify patients who could benefit most from having care managers to coordinate care, Sutariya says. He counsels prospective clients to think about using predictive analytics in clinical areas where changes are feasible and can make the biggest impact.
Making analytics predictive
True predictive analytics differs from bread-and-butter statistical methods, such as logistic regression models that deal in probabilities: it doesn't require human intervention to identify relevant variables or formulate a hypothesis. Often, predictive models use machine-learning techniques that find patterns in data where no one thought to look.
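The contrast can be made concrete. In a conventional logistic model, an analyst chooses the variables and the model simply turns them into a probability. The sketch below illustrates that baseline approach; the features, coefficients and intercept are entirely hypothetical, invented for illustration, and do not come from any model described in this article:

```python
import math

# Hypothetical hand-specified logistic readmission model.
# An analyst has already chosen the variables and fit the weights;
# the model only converts them into a probability.
COEFFS = {"age": 0.03, "prior_admissions": 0.45, "num_meds": 0.08}
INTERCEPT = -4.0

def readmission_probability(patient):
    """Logistic model: P = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    z = INTERCEPT + sum(COEFFS[k] * patient[k] for k in COEFFS)
    return 1.0 / (1.0 + math.exp(-z))

high_risk = readmission_probability(
    {"age": 78, "prior_admissions": 3, "num_meds": 12})
low_risk = readmission_probability(
    {"age": 45, "prior_admissions": 0, "num_meds": 2})
```

The machine-learning approaches the article describes differ in that no human pre-selects `COEFFS`; the algorithm searches the data for whatever variables and interactions carry signal.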
Mercy, the seventh largest Catholic health care system in the United States, includes 45 acute care and specialty hospitals, primarily in the Midwest. Mercy’s leadership team had already committed to developing care pathways that deliver optimal care across all of the system's clinical operations. As early adopters of EHRs, they had many years of historical data but found that traditional descriptive analytics was not providing solutions for what they saw as unacceptable variation across member hospitals.
“We found out that descriptive analytics will tell us how we’ve done in kind of a rearview mirror look, but we need to find ways to predict into the future,” says Vance Moore, senior vice president of operations at Mercy.
Mercy, he says, wants to use analytics to make meaningful improvements on both the clinical and operational sides of its business. “We are trying to predict the future and stage human and physical assets near their point of need so we can service the customer much better, and we become much more efficient in doing so, as well,” he says. Moore gives the example of the common hospital practice of waiting until a patient is ready to be discharged to find a wheelchair, when it should be possible to anticipate that need well in advance.
To meet its goals, Mercy partnered with Ayasdi, an analytics company that combines machine learning with algorithms that generate geometric patterns in big data. The visual aspect of the search structure helps analysts identify data that form meaningful clusters and can be investigated further. In an early use case, Mercy wanted to understand the factors that influence hospital length of stay after joint replacement surgery.
Searching across Mercy’s network, the Ayasdi algorithms identified a cluster of patients with shorter lengths of stay. Looking more closely, the analysts identified a group of providers that was using pregabalin, a neuropathic pain reliever, in the acute aftermath of surgery. The patients who received this treatment used less opioid pain medication and were ambulatory more quickly than other patients.
“It would have been a much longer process to have arrived at this finding any other way,” says Sangeeta Chakraborty, chief customer officer at Ayasdi.
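Once an algorithm has surfaced a cluster like this, the follow-up comparison an analyst might run is simple. The toy sketch below is not Ayasdi's algorithm, which discovers such cohorts automatically; it only illustrates the kind of cohort comparison involved, with invented records and numbers:

```python
from collections import defaultdict
from statistics import mean

# Invented encounter records: did the patient receive the drug,
# and how long was the stay? (Illustrative values only.)
encounters = [
    {"provider": "A", "pregabalin": True,  "los_days": 2.1},
    {"provider": "A", "pregabalin": True,  "los_days": 2.4},
    {"provider": "B", "pregabalin": False, "los_days": 3.8},
    {"provider": "B", "pregabalin": False, "los_days": 4.1},
    {"provider": "C", "pregabalin": True,  "los_days": 2.0},
]

# Group lengths of stay by treatment and compare the means.
by_treatment = defaultdict(list)
for e in encounters:
    by_treatment[e["pregabalin"]].append(e["los_days"])

avg_los = {k: mean(v) for k, v in by_treatment.items()}
# avg_los[True] < avg_los[False] points at the shorter-stay cohort.
```

The hard part, as Chakraborty's comment suggests, is knowing which of thousands of possible groupings to test; that is what the cluster-finding algorithms contribute.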
Using the Ayasdi algorithms, Mercy has now implemented an automated system of 84 care paths covering 80 percent of the care delivered within its system. Data from each patient encounter are evaluated in real time, and as new technologies are introduced, the software continually measures whether each is making a positive difference for patients. The plan, Moore says, is to measure the difference between target and reality for each care path and make changes as the data warrant, so that what works in terms of cost, quality and efficiency rises to the top.
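The target-versus-reality measurement Moore describes amounts to a per-path variance calculation. A minimal sketch, with path names, metrics and figures invented for illustration:

```python
# Hypothetical per-care-path targets and actuals (length of stay, days).
care_paths = {
    "joint_replacement": {"target_los": 2.5, "actual_los": 3.1},
    "heart_failure":     {"target_los": 4.0, "actual_los": 3.8},
}

# Variance = actual minus target; positive means running over target.
variances = {
    name: round(m["actual_los"] - m["target_los"], 2)
    for name, m in care_paths.items()
}

# Paths running over target are candidates for a change in practice.
over_target = [name for name, v in variances.items() if v > 0]
```

In practice a system like Mercy's would track many metrics per path (cost, quality, efficiency), but the feedback loop is the same: measure the gap, change the path, measure again.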
Doctors are fully informed about why any change of care is being considered and have input into any changes, says Todd Stewart, M.D., vice president of clinical integrated solutions at Mercy.
But health care analytics vendors emphasize that hospital systems must be open to going where the data take them, even if the answers defy expectations. “Our mission is to get into the hands of doctors all of these findings that are coming out of these reams of data,” says Chakraborty. “Then it is up to the hospital system to decide the best way to make change happen.”
Homegrown analytics solutions
Some health systems have opted to build analytics capability from within, hiring data scientists and developing their own predictive analytics systems.
The University of Pennsylvania Health System has invested heavily in a centralized data warehouse and development of machine-learning models that can make forecasts and then push the results back out to Penn’s EHR system.
Chief data scientist Michael Draugelis is its systems architect. Coming from a data-analysis career in the missile defense industry, Draugelis was well-versed in delivering predictive analysis under tight deadlines. When he was hired 18 months ago, he promised to deliver a working model in six weeks and to provide information of value every month.
“Typically, when going through this [process] for the first time, there is not an appetite to say, ‘We’ll spend seven months of really expensive people to come out with a failure,’” says Draugelis. “So we are picking projects where there is immediate value, and we know we can capture some.”
Penn’s early projects have centered on spotting patients at high risk for heart failure and sepsis. Until the machine-learning model was implemented, only heart and vascular service line patients were being evaluated for heart failure, and the system was only capturing about half the patients at risk, says Draugelis. Now, all patients deemed at high risk through the machine-learning algorithm are flagged for follow-up, and the hospital is piloting a project to connect high-risk patients with home care.
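The flag-for-follow-up step described above can be sketched as a simple threshold rule over model scores. The patient IDs, scores and the 0.7 cutoff below are invented, not Penn's actual model output or threshold:

```python
# Hypothetical cutoff above which a patient is flagged for follow-up.
THRESHOLD = 0.7

def flag_high_risk(scored_patients, threshold=THRESHOLD):
    """Return IDs of patients whose model score meets the threshold."""
    return [pid for pid, score in scored_patients if score >= threshold]

# Invented (patient_id, model_score) pairs.
patients = [("p001", 0.92), ("p002", 0.41), ("p003", 0.77)]
flags = flag_high_risk(patients)
```

In a deployment like the one the article describes, the flagged IDs would be pushed back into the EHR so clinicians see them in their normal workflow rather than in a separate report.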
“Really, the value is in the data,” Draugelis says. “With a small team of developers and data scientists, you can build a system. What’s happening in the marketplace right now is there are a lot of vendor solutions out there, but I’m not sure the market is valuing the right thing right now.”
Penn plans to scale up its model, named Penn Signals, and publish it midyear as an open source solution for others to use. “What I want to do is make the technology freely available, and I think that’s a first step to lowering the barrier to making these solutions available,” he says.
Draugelis points out that any institution that wanted to deploy his open-source system also would have to invest in the data scientists who know how to use its capabilities.
For health systems with more modest resources, Seattle-based Tableau Software offers a data-visualization system that can import data from myriad sources and combine them in a visually intuitive dashboard. Given its origins as a collaborative visualization project by then-Stanford graduate student Chris Stolte and his adviser Pat Hanrahan, a founding member of the movie animation company Pixar, the software is flashy, with lots of visualization options. Its biggest strength, though, may be its ability to merge data from disparate sources into a dashboard that data novices can navigate. That ease of use may account for its explosive growth: in a 2014 HIMSS survey, Tableau was reported to be the most commonly used data-visualization software.
Andy Dé, Tableau's managing director for health care and life sciences, says the beauty of the software is that it allows doctors, nurses and other practitioners to ask and answer their own questions. The company is so confident that users will see value in the software that it offers a free 14-day trial and says the system can be deployed across an organization in weeks, not months.
While Tableau works with traditional spreadsheet-style data, it has the flexibility to incorporate much more sophisticated back-end predictive analysis, says Dé. For instance, Tableau can render visualizations driven by predictive models running behind the dashboard.
Rajib Ghosh, chief data and transformation officer at Community Health Center Network in central California, says Tableau helps his organization’s eight community health centers manage the attributed population of about 200,000 within their accountable care organization. For two years, they have been using the system to combine financial, EHR and pharmacy data to understand how people were accessing services; now, they are working to predict how demographic shifts will affect utilization. During this early phase of rollout, the organization has focused on putting a data-governance structure in place and making sure data definitions are uniform across the organization.
Ghosh says that for his organization, having business analysts on staff who can look at the data critically to solve priority issues is crucial. “Tableau, data; those are tools,” he says. “These are the means to an end, they are not an end in itself. If you don’t have a person who can do that and you just have some IT developers, that’s not going to cut it.”
Tips for Leaders Looking to Implement Analytics
• Uniform, high-quality data are essential for any analytical results to be trustworthy.
• Internal governance processes and structures must be organized to allow integration of disparate data sources: clinical, operational and financial.
• Predictive analytics is only effective if stakeholders buy into using the information.
• The chief analytics officer is a new C-suite position being created as organizations realize that analytics can be a competitive differentiator.
• Strategic analytics leadership matters. Hospital systems with a CAO on the executive team have greater analytics maturity than organizations in which the CAO is not on the executive team.
• Executive leadership at companies with the highest analytics maturity place high importance on the use of data throughout the organization. They also recognize their organizations are not as effective as they need to be in using data.
Sources: “The State of Analytics Maturity for Healthcare Providers,” Feb. 24, 2014, survey report of HIMSS and the International Institute for Analytics; “Data Needed for Systematically Improving Healthcare,” July 31, 2015, report of the National Quality Forum.