It seems straightforward enough: Start with billions of bits of data from those electronic records installed at great expense; divert and load the data as needed into formulas that calculate performance; report results to the federal government or commercial payers; get paid based on quality rather than quantity of service. Now, plow quality-metrics information back into clinical operations to guide improvement, and repeat.
Quality metrics, the mechanics of gathering the data elements, and the results underpin the entire premise that health care can be redesigned on a pay-for-performance model.
However, quality-metrics reporting in many cases is a clunky, labor-intensive exercise in digging through paper charts or electronic health records in search of the myriad data elements — an aspirin dispensed, a test performed, an exception to the rule noted — that have to be assembled to produce a true result. There is tremendous pressure from payers, businesses and patients for more immediate access to quality metrics and performance results. A value-driven delivery model and pay for performance demand that information be collected and analyzed almost instantaneously. The solution: Perform measurement electronically.
That's somewhat easier said than done. While EHRs and other health IT applications have certainly made it easier for providers to collect and comb through mountains of data, there is still a significant disconnect between the way quality measures are written and how EHRs work. Quality measures in place today were largely written before the dawn of health care's digital revolution. The measures are built for the human brain, not the processors running inside of that expensive EHR.
"Many of the performance measures were not developed with the electronic record in mind," says Helen Burstin, senior vice president for performance measures with the National Quality Forum. The current approach involves "taking what we have and retooling it, as opposed to really coming up with new measures that truly take advantage of the high-quality data within an EHR."
Whatever their benefit in amassing good data, computers lack basic human smarts. "Measures were developed with the thought that someone with clinical knowledge would be looking at the medical record and pulling out the relevant information," says Nancy Foster, vice president for quality and patient safety at the American Hospital Association. "So it didn't matter whether Dr. Jones wrote something one way and Dr. Smith wrote it another way as long as someone with clinical knowledge understood that what they were writing actually meant the same thing."
Measures dependent on this human analysis "are not easily and perhaps never can be fully translated into measures that can be abstracted from [EHRs]," Foster adds.
New measures, new problems
Spurred by the federal government, measure developers under the guidance of the NQF are barreling down a new path: creating quality metrics from scratch, using computer logic and digital collection modes. That's a step forward, but it creates some new problems. "When you try to automate something, you have to be very, very specific about exactly what every term means and how you represent that," says Pamela McNutt, senior vice president and chief information officer of Methodist Health System, Dallas.
The problems getting IT systems to turn out quality metrics that reflect what's actually going on vex information officers enough that the College of Healthcare Information Management Executives had this to say in late August in response to a federal request for information on the subject: "The accurate reporting of quality measures is one of the most daunting challenges faced by providers today."
And it is not just a hospital-specific issue, says Dana Sellers, CEO of consulting firm Encore Health Resources. This problem stretches across the entire care continuum, affecting everything from contracts to approaches to care to specifics of capturing information.
Unfortunately, that's a challenge organizations typically don't address when talking about how to build out quality metrics, says Ron Paulus, CEO of Mission Health System, Asheville, N.C., and a longtime performance-improvement expert.
"It's easy to put this stuff at the 80,000-foot level and say, 'OK, we'll align incentives and get people to do this,' but it's a completely different world when you're actually trying to create that transformation," Paulus says. A cascade of impediments stands in the way.
Glitches and gaps
Converting measures to electronic form will require "teaching the system to be intelligent," able to do "what the humans were doing," says Lisa Taylor, an information management expert at the American Health Information Management Association. To read and handle e-measures, systems will need a lot of help, technological and otherwise. "We can't forget that we're not just trying to change paper into electronics, we're also trying to automate what was a very complex thought process," notes Maggie Lohnes, health care principal at the MITRE Corp., a nonprofit technology development firm doing work solely for the federal government.
First of all, data have to be represented in standard ways so the computer knows what it's looking for and can find it every time. Not only does the computer logic have to be unambiguous, but the clinical vocabulary and values of a measure — what exactly defines a diabetic, for example — and choice of codes to represent information always have to be the same. This will ensure that the data are accurate and can be used to make comparisons, says Lohnes. The NQF, which supervises and endorses the work of measure developers on behalf of the federal government, is rolling out a model of specifications for them to follow, but settling on and executing guidelines will take time, she says.
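The idea behind unambiguous value sets can be sketched in a few lines. In this illustrative Python fragment, the diagnosis codes and the value set are hypothetical stand-ins, not an actual NQF specification; the point is that a patient counts as "diabetic" for a measure only if a recorded code matches the agreed-upon set, so two sites recording the same codes always classify the patient the same way.

```python
# Illustrative sketch: a shared "value set" makes measure logic unambiguous.
# The codes below are hypothetical stand-ins, not a real measure specification.
DIABETES_VALUE_SET = {"E11.9", "E10.9", "E11.65"}  # agreed-upon diagnosis codes

def in_denominator(diagnosis_codes):
    """A patient enters this measure's denominator only if a recorded
    code matches the shared value set -- never by free-text judgment."""
    return any(code in DIABETES_VALUE_SET for code in diagnosis_codes)

# Two sites that record the same structured codes get the same answer:
print(in_denominator(["E11.9", "I10"]))  # True  -> counted
print(in_denominator(["I10"]))           # False -> not counted
```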
If those problems were to be solved today, it would set up hospitals and physician offices to use computers for measure reporting — but not necessarily successfully. Loading data into a system so it can be reported later still would be a major problem. Information officers run the measure calculations and come up with obviously underreported results — sometimes zeroes — because the data either weren't there to retrieve or were tucked in a place the system wasn't told to check or couldn't access. A big hurdle is unstructured data, primarily physician and nurse notes in narrative form, containing a lot of irretrievable information that measures require.
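Why a correctly programmed calculation can still return zero is easy to show. In this hypothetical sketch (the field names and data are invented for illustration), aspirin that was given but charted only in a free-text note never reaches the structured field the measure engine queries, so the reported rate understates actual performance.

```python
# Hypothetical sketch: a measure engine counts only what is structured.
patients = [
    {"id": 1, "meds_administered": ["aspirin"], "note": ""},
    {"id": 2, "meds_administered": [],              # aspirin given, but
     "note": "ASA 325 mg given on arrival"},        # charted only in free text
    {"id": 3, "meds_administered": [], "note": ""},
]

denominator = len(patients)
# The engine checks only the structured medication field, not the note:
numerator = sum(1 for p in patients if "aspirin" in p["meds_administered"])

print(f"reported rate: {numerator}/{denominator}")  # 1/3, though true rate is 2/3
```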
Even if all obstacles to reporting information were cleared away, today's information systems typically are not set up to feed that information back quickly to the clinicians producing the results, to analyze the problems that the measures turn up and to facilitate actions that improve quality and safety. The ultimate purpose of measuring performance is not just to report it, but also to act on it, and that's the key to improving results that then are rewarded in reform-era contracts, says Linda Lockwood, R.N., Encore Health Resources associate partner who's leading its clinical advisory practice.
Today, performance data largely look backward — how well clinicians adhered to protocols during the prior week or months. Charles Christian, CIO at Good Samaritan Hospital in Vincennes, Ind., envisions a time when actionable data make it to clinicians in a far more timely fashion. "In the world to come, it's how we're doing in real time," he says. "The goal is to have a positive outcome while the patient's here … not looking in the rearview mirror" at whether a necessary medical action was overlooked or a critical intervention should have been done.
What providers can control
Persistent problems with the retooled measures and the lagging pace of tested and valid e-measures foster a just-get-by attitude, experts say, as health care providers go through an attestation that their systems technically can produce numerators and denominators of core measures as part of the meaningful use requirements of the HITECH incentive program.
Early in the program, says Encore's Lockwood, health care providers that engaged her firm embraced the meaningful use blueprint for quality improvement as a chance to implement the technology, processes and workflow changes that would prepare them to be rewarded for performance. More recently, though, the perspective has shifted to attesting successfully for the money. There may be zeroes in their formulas, Lockwood says, but providers have heard it is OK to report that way; the priority becomes checking that the submission gets through, not tracing the underlying reporting flaws.
But "people are going to very quickly understand this is not something they can rush past," Sellers says. "They have to capture this data, and they really have to be able to use the data to improve performance." It's not just for a government requirement, either. One hospital client, for example, has signed a contract with a commercial payer with a $3 million bonus tied to meeting 11 quality metrics.
"Nine of those metrics were directly from the meaningful use quality measures," Sellers says.
Susan Kiley, an Encore consultant, dealt with one commercial payer whose instructions for a local contract were coming from the central office. "They're going to have the same metrics nationwide, and many of those metrics are going to be meaningful use," she says. "Commercial payers are going to get to that quicker than the government is."
Though a lot of the problems with the measures are out of providers' control, there are plenty of reporting steps in their own operations that only they can set right. Documentation accuracy and availability in IT systems and elsewhere will enter into reimbursement soon, Christian says. "My concern is [that] if we don't do a really good job getting those database elements in the appropriate places, and fix the workflows so that data can be entered … we'll fall out [of compliance with a measure] and our reimbursement will be decreased. It's not because we didn't provide the quality of care, it's just the fact [that] we didn't document that quality of care in the right place."
A hybrid solution
E-measures or not, the challenge of getting to data that are unstructured or from an outside electronic feed will complicate measure reporting for years to come, experts say. Despite significant EHR adoption, most clinical IT systems are a hybrid of structured data in records, unstructured input such as transcribed reports, claims-based data and various other data sources, says AHIMA's Taylor.
Some ambulatory measures, for example, may rely on pharmacy claims, which are not within the usual domain of an EHR, Burstin points out. Narrative clinical notes, says the AHA's Foster, often hold key data, such as a contraindication to a medication that a metric would otherwise require. Ferreting out such exceptions is important to prevent a provider from being penalized for clinically sound actions. In addition, lab findings sometimes are needed to confirm diagnoses that bear on whether to do something required in a measure. "Not every EHR accepts input from every lab database or radiology database, so you may not have a way to abstract relevant information about a particular patient simply from your EHR," Foster says.
The business reality for health care providers is that the measures have to be pulled together somehow, and although EHRs can greatly shortcut the process, pure automation is not the goal. Some elements "can best be derived from claims data, some information can only be derived by having a knowledgeable clinician look [at records individually], and, hopefully, the bulk of it can come from good specifications that allow us to pull it from the EHR," Foster says.
In the two years since Paulus became CEO of Mission Health, all those methods plus custom and commercially bought analytical IT tools have been marshaled for an all-out campaign of performance improvement. [See sidebar, "A CEO on a Mission."] "We're jerry-rigging our way through our journey just like everybody else," he says. The first phase in a program of comprehensive performance assessment is to list the gaps in the information equation and go about plugging them. "You can embed things in the EHR once you know what it is that you need," he says.
What Mission and others following this strategy end up with are current measures and a means to display them prominently so the right people can be aware of how the hospital is doing all the time.
Getting to real-time reporting
To find gaps in care processes and documentation, Lockwood suggests taking a look at measures used for meaningful use attestation. One of the more difficult measures to document is treatment to prevent venous thromboembolism, or blood clots in the arms, legs or lungs. Looking at the reports generated by the EHR, she says, "very quickly you can see that there might be zero for the use of a protocol for VTE prophylaxis … but you can see that they had 40 VTE patients."
What often happens is that the organization did not implement the order sets for a care protocol and the evidence-based activities associated with it, or did it in a way that couldn't capture the data, Lockwood says. The information wasn't discrete, or there wasn't a "hard stop" in the order process so something had to be selected or documented to go on. Or if a doctor did not use the protocol, there was no provision to require an explanation of why. "The organization hasn't gone to the depth to set the foundation of processes and workflow tweaks to get and use elements of the metrics."
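The "hard stop" Lockwood describes amounts to refusing to complete an order unless the measure's data element, or a documented reason for skipping it, is captured. A minimal sketch, with invented field names, of what that check might look like at order signing:

```python
# Minimal sketch of a "hard stop" at order entry (field names are invented).
# An admission order cannot be signed unless VTE prophylaxis is either
# ordered or explicitly declined with a documented reason -- so the data
# element a quality measure needs is always captured, one way or the other.
def sign_admission_order(order):
    if order.get("vte_prophylaxis_ordered"):
        return "signed"
    if order.get("vte_prophylaxis_declined_reason"):  # captured exception
        return "signed"
    # Neither choice documented: block the order instead of losing the data.
    raise ValueError("Select VTE prophylaxis or document a reason to omit it")

print(sign_admission_order({"vte_prophylaxis_ordered": True}))
print(sign_admission_order({"vte_prophylaxis_declined_reason": "bleeding risk"}))
```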
From a technical standpoint, that internal assessment and infrastructure work might take only three weeks or so, but succeeding at it requires a far bigger and longer cultural commitment. "You're going to have to test it, but you're also going to have to train, you're going to have to stand by your clinicians, you're going to have to message why this is important," Lockwood says, "and not because the government's giving us money, but because this is an overall patient outcome experience, and that's the way your organization looks at it." [See sidebar, "When Doctors Balk."]
That training and technical work have been completed at more than half of Tenet Healthcare Corp.'s 50 hospitals, concurrent with the rollout of computerized provider order entry capabilities.
"When you have payer data, there's always a lag. And a payer's data are usually claims, and that's not usually right. These are real clinical data pulled directly from those places in the EHR," Lockwood says. "Smoking data are pulled from the nursing history; medication [data are] pulled from ordered medication; a reason why you wouldn't do a test is, hopefully, pulled from a physician note or an order attached to an order set."
After initial implementation, follow-up meetings assess the scores and what didn't seem right. Tenet managers could investigate, for instance, whether clinicians were entering the data as they should. The system allows them to check by unit or doctor, get the targeted clinicians to better understand the workflow around a care protocol and its documentation, and retrain.
Investigations sometimes turn up measure noncompliance that no one could have foreseen. One client found that compliance with using the problem list was extremely low, especially in a unit for newborn babies, Sellers says. It turned out to be a misunderstanding based on a literal approach to the problem list.
"The staff were thinking, 'These are healthy babies, why do I have to record a problem?' But they had to record something on the problem list. Really, all they needed to do was click 'no problem' on a healthy baby and then they would be compliant," she says. Once that was ironed out, compliance numbers went way up.
John Morrissey is a freelance writer in Mount Prospect, Ill.
A CEO on a Mission
Ron Paulus has exemplified clinical performance improvement throughout his long health care career, so it's no surprise that the integrated delivery system that brought him in as CEO two years ago soon would reflect its leader.
Collecting and reporting quality metrics was but one of a slew of projects instituted at Mission Health, a five-hospital system based in Asheville, N.C., to achieve four aims: getting the desired outcome, without harm, without waste, and with an exceptional experience for patients and their families. Success at these objectives, says Paulus, played "no small part" in Mission Health's cracking the nation's top 15 health care systems this year in an annual analysis by Thomson Reuters.
Investments in performance improvement included identifying processes to repair and continually track, improving both the usual industry core measures and other measures deemed a high priority for the health system's population, which is 75 percent Medicare and Medicaid. Patient satisfaction analysis was carried out to the nth degree; patient reps and top execs roamed the floors.
"It's some time and money, but the real value is what you can get out of it," says Paulus, who brought his experience as chief technology and innovation officer of Geisinger Health System, Danville, Pa. "It's altering the core product of our delivery, which is efficiency and outcomes of care. What we're seeing is we're moving these measures in a pretty significant way. I can show you before-and-after graphs of all these things that would knock your socks off."
Mission Health has mobilized a range of tools to find, lift and scrutinize clinical data. One of the systems deployed, which analyzes possible clinical shortcomings to zero in on, originally was developed in the 1990s by CareScience, a company he co-founded with David Brailer, who went on to become the first head of the Office of the National Coordinator for Health Information Technology.
A special project to detect and treat delirium called for "quasi-manual data collection," Paulus says. Some of that data was bundled into an EHR flow sheet and became part of nursing documentation. "Ultimately we get it concurrently and electronically, but it's still because somebody's capturing it."
There's a publicly displayed quality board in every unit of the health system. "On these boards are the most recent data for that unit as well as the overall hospital performance data. These are very prominent so that patients and their families and docs and nurses, everybody can see them."
Weekly leadership rounds take Paulus and other executives to individual hospital units to determine what works, what needs improvement and the goals to work toward. In addition, 100 patients are embedded into a facility-redesign and master-planning process, and a consultancy called ExperiaHealth follows patients around during their stay to discern how they see their care unfolding. Including patients "changes the dialogue. If you have a patient sitting there side by side with the doctor and the nurses and an administrator and the pharmacist and we're saying, 'How do we optimize this process?' nobody can say, 'That's really just too inconvenient' when the patient's sitting right there."
Capturing clinical information remains a chief challenge. Two prime targets for innovation are pharmacy data for medication reconciliation and turning unstructured data from notes into usable elements. Both use mid-level skilled staff to their fullest instead of roping in doctors to get basic data collected.
In the emergency department, pharmacy techs track documentation on medications. "It doesn't make sense for docs to be doing all this work. They should only do the stuff that's commensurate with their license, training and economic earning level," Paulus says. Mission Health is piloting a link to the e-prescribing exchange Surescripts "to provide 360-degree observation of true filled data as well as order data for med-rec in the ED." The feed will greatly cut into the manual work now necessary to get that information.
The missing link in converting unstructured notes into collectible data could be all the transcriptionists being phased out of their original job description by the rise of voice-recognition-to-text, says Paulus. "They're very familiar with medical terminology, they're familiar with rudimentary components of coding and abstracting. As we automate away that core transcription model, rather than saying we're just going to cut everybody and send them home and lose jobs, how can we transform their work into a more value-added component? They're naturally positioned to be able to take that automation-generated text and glean out of that some of these key data elements." — John Morrissey
When Doctors Balk
A successful record on performance will depend on how well doctors, nurses and others electronically verify that they did what they should have done, or why a patient's condition or circumstances justified not doing what the quality metric advised. The challenge for IT and operating officers is to work this documentation into the clinical workflow without overburdening or distracting providers.
Physicians are pushing back against additional documentation requirements. "Honestly, doctors are in a pinch for their time … because time is money for them and, with reduced reimbursement, you can't ask them to add another 15 minutes onto their documentation of a case, because it's stopping them from going to the next case," says Pamela McNutt, senior vice president and CIO of Methodist Health System, Dallas. One of the complaints is, "You're asking the doctor to fill out a lot of little click-y boxes where they used to just talk about the patient," she says.
Until recently, work to ferret out facts in what a doctor wrote was left to others. "Before, somebody would just read through it and say, 'Oh look, this person had a contraindication and that's why we didn't prescribe it,' " says McNutt. "But in order for the system to calculate that automatically, someone has to click a box."
To get to the real-time reporting of performance, IT systems have to add places in the normal flow of things where data is sure to be captured, documented and retrievable, says Charles Christian, CIO of Good Samaritan Hospital, Vincennes, Ind. The facility's IT pros have changed a significant number of charting screens for nurses and added items in the provider ordering system to capture elements needed for metrics.
It's up to the clinicians to use these tools and to understand why, says Linda Lockwood, associate partner with Encore Health Resources. At one academic medical center with a 700-physician practice, she says, doctors were pushing back until physician leadership stepped in and made clear that metrics are going to be the report card on doctors' performance and how they'll be paid in the future.
But changing physician attitudes won't be easy, says Karen Marhefka, associate CIO of UMass Memorial Medical Center, Worcester. She's overheard comments from doctors like, "I'm not a team player, I'm a physician," and that a team- and community-based approach to care is something physicians "don't feel natural with" if they've been in the field a long time. Given that outlook, many see themselves as already too busy to enter the data that make reporting work.
"Right now, this moment, there's a doc seeing a patient here saying, 'I don't know if I want to put that [item of metrics data] in there, and I have to see 20 more patients before 5 o'clock and I'm just going to skip over this stuff, because I just don't feel it's important,'" says Marhefka. "That's a physician who has not had the opportunity to understand the value of what we're asking him or her to do." — John Morrissey
Quality measurement is more than running down "core" and "menu" metrics in meaningful use Stage 2 and showing that a calculation can be done electronically. Success is measured in the readiness, willingness and ability to get it right and serve performance-based objectives. Senior management will have to:
Gain acceptance, trust of physician staff.
Doctors not only produce much of what is reported, but also will be the most affected by the qualitative judgments that result from the data. "Many provider organizations are struggling with, 'Where are we going to get it?' 'Do we trust it?' 'Is it meaningful — is it what we really want to report?' " says Karen Marhefka of UMass Memorial Medical Center, Worcester. Physicians should know that their efforts to input accurate and thorough information are key to making it trustworthy.
Push for workflow changes.
Part of that is introducing logical places to capture needed data elements. A certified EHR by itself "doesn't mean that the workflows work well," says CIO Charles Christian. "It just means we have the right tools — now, are we using them?" Another facet is getting clinicians to cooperate. Many doctors don't like using the digital problem list, for example, but that's where the data to support measures will often reside, says Linda Lockwood of Encore Health Resources.
Determine who should collect/enter key data.
Pharmacy techs, transcriptionists, quality-assurance coordinators, coders who document during hospital stays all can take part of the load. The first priority is to determine what processes can be computerized, and "what can't be automated can be delegated" based on the skill level required, says Mission Health CEO Ron Paulus. "Why do you want the doctors to do all this structured data entry when someone who is less expensive can do the same thing?"
Consider all emerging tools to report measures.
The Office of the National Coordinator for Health Information Technology issued EHR certification criteria so IT systems will have the functions providers need to produce quality work, but it's also pursuing other ways for providers to get what they should from data. For example, an open-source quality measure reference implementation called popHealth was initiated to achieve in EHRs "the intent of the government and the measure stewards when it comes to the actual quality metric reports," says Rob McCready, a MITRE Corp. expert involved in the effort.