In an era when hospitals are increasingly expected to perform better while using fewer resources, just how much do information technology investments help hospital bottom lines, even as those systems consume more and more money to maintain? And with a long list of potential spends, which IT outlays are critical? I'll be writing about this topic for an upcoming spread in H&HN, and the view from the field suggests that while hospitals see IT spending as an essential element of competing in a transformed delivery system, keeping up with a slew of new IT capabilities can be time-consuming and costly.

A new report from PricewaterhouseCoopers, which explores the overall fiscal challenges facing academic medical centers, estimates that AMCs stand to lose up to 10 percent of revenue to reform pressures and other reimbursement shifts in the coming years. The report recommends, among other things, that AMCs make major IT investments in everything from EHRs to telemedicine services and simulation centers, which it suggests will be necessary for long-term survival. It found that 90 percent of AMCs have begun investing in or plan to hire additional IT staff to manage data and systems, and that 54 percent plan to collaborate with other research or medical centers to share EHRs over the next five years.

There are stumbling blocks, of course; few AMCs are in a strong enough position to lead such efforts and to handle the security and HIPAA compliance issues involved in shepherding sensitive patient data from provider to provider, the report found. It also noted that only a fifth of health care organizations sharing data externally have a process for managing patient consent for data sharing.

On the telemedicine front, the report found that 69 percent of AMCs surveyed planned to expand their telemedicine offerings, and it offered University of Massachusetts Memorial Health Care's $8 million tele-ICU initiative with two community hospitals as an example of how telemedicine can improve both quality and the overall bottom line. The initiative led to a 20 percent reduction in mortality, enabled the two community hospitals to see 50 percent more patients and produced a $2,600 reduction in costs per patient at Memorial.

Another report, released last year by Fitch Ratings, looked at 291 hospitals in its highest portfolio class, of which 24 were deemed stage 6 or stage 7 meaningful use-ready by HIMSS standards. Those hospitals had 46 percent higher revenue on average than the other hospitals in the survey and also performed somewhat better on quality outcomes, posting a 1 percent decrease in length of stay in 2010. Of course, there's another way to look at those numbers: profitable hospitals are more likely to have the cash on hand for major IT investments, especially those that can get several years ahead of the game on meaningful use criteria.

And still another recent report, from the Optum Institute for Sustainable Health, paints a more mixed picture: while 87 percent of hospitals surveyed have implemented an EHR, and 70 percent have attested successfully for stage 1 meaningful use, the CIOs surveyed complained of rising costs to support the expanded capabilities, including spending on interoperability, licensing agreements, system modifications, vendor support, upgrades and the purchase of new systems.

And those are just big-picture statistics that don't take into account the unique challenges any hospital faces when administrators, vendors, doctors and other clinicians have to work together to install a new system, change longstanding workflows and successfully implement a new way of providing care. I'd like to hear from you, the reader, though, about real-world health IT initiatives that have led to meaningful financial improvements or savings in your hospital. Have a story about harnessing IT to improve the bottom line? Email me at hbush@healthforum.com.