Accelerating advances in computing will affect health care. More computing power equals more automation; greater efficiency; lower costs; better clinical, administrative and logistical decisions; better clinical research and understanding; and, ultimately, better care for the patient and better support for those delivering that care — those left standing, that is, after the tsunami hits.
In fact, the tsunami began heading up the beach in the 18th century, with the beginning of modern medicine — the acceptance of the scientific method and the germ theory of disease. The tsunami swelled over the next two centuries as the X-ray, penicillin, better anesthesia, imaging technology, minimally invasive surgery and robotic surgery brought us to the present era of postmodern medicine — personalized, bionic, regenerative and digital.
The tsunami continues to grow bigger and stronger; therefore, we are that much higher on the exponential curve of acceleration. So much higher, it has taken only five years for what was once the world's fastest supercomputer, IBM's Roadrunner, to be consigned to the scrap heap. It has taken about the same time for cloud computing — a form of "big data" computing once affordable only to Pentagons and Amazons and Googles — to become affordable to medium-sized organizations such as hospital systems.
Tidal Wave of Data
Big data are, or should be, of immense interest to hospitals. First, because — like it or not — hospitals are going to amass data like never before. Hospitals are, or should be, interested in the molecules that make up their patients, and those molecules far outnumber the books and songs and gizmos in which Amazon is primarily interested. Second, hospitals must and will analyze and use big data to make quantum improvements in the diagnosis and treatment of the patient and in the efficient, safe and effective provision of services.
Big data include not only data that currently reside in clinical, administrative, logistical and billing databases (which amount to fairly big data), but also the genomic, proteomic, glycomic and other -omic data we will soon be collecting routinely for every patient (which amount to huge data). Big data include the health status and vital signs data starting to pour in from sensor-laden devices located in, on and around the patient, which also will amount to huge data. (See Scott DiDonato's excellent article "Integrating medical equipment with electronic health records" in the March 2013 issue of Health Facilities Management magazine for a thorough review of this particular topic.)
Quantum improvements in diagnosis and treatment will come from the manipulation of all these data in the form of molecular-level, patient-specific simulation models, all the way down to the genome and metabolome. It took IBM's Roadrunner to do that just for a tiny virus. What about doing it for a human being?
During its brief but distinguished existence on this earth, Roadrunner was used to model viruses as well as nuclear explosions, unseen bits of the universe and other fascinating things. It was installed at the Los Alamos National Laboratory in 2008 and decommissioned just a couple of months ago, in March 2013. It cost $121 million and was the first computer to process in petaflops (quadrillions of mathematical calculations per second). Roadrunner's replacement, Cielo, is smaller, faster and more energy-efficient, and it cost only $54 million. But Cielo is still far from being able to manipulate the complexity of a human simulation.
So how do we get from the molecular model of a virus to the molecular model of a patient? The answer: through the acceleration of computing power, through harnessing the power of the tsunami.
A supercomputer is really not "a" computer. It is thousands of computers (processors) linked together physically and with software that enables them to work cooperatively and in parallel on massive tasks such as modeling a virus. Roadrunner had 278 refrigerator-size racks, each teeming with processors linked to one another by 55 miles of fiber-optic cable.
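The divide-and-conquer idea behind those racks of linked processors can be sketched in a few lines of code. This is a hypothetical toy example, not how Roadrunner actually worked (real supercomputers coordinate thousands of nodes over message-passing interconnects); it simply shows one big task being split into chunks that many workers compute in parallel before the partial results are combined:

```python
# Toy sketch of supercomputer-style parallelism: several workers each
# take a slice of one massive task and compute it simultaneously.
# The "simulation" here is a placeholder sum, not real physics.
from multiprocessing import Pool


def simulate_chunk(values):
    """Stand-in for the work done on one slice of a large model:
    sum of squares serves as a placeholder calculation."""
    return sum(x * x for x in values)


def parallel_simulation(all_values, n_workers=4):
    # Split the full task into roughly equal chunks, one per worker.
    chunk_size = (len(all_values) + n_workers - 1) // n_workers
    chunks = [all_values[i:i + chunk_size]
              for i in range(0, len(all_values), chunk_size)]
    # Workers process their chunks in parallel; results are then combined.
    with Pool(n_workers) as pool:
        partial_results = pool.map(simulate_chunk, chunks)
    return sum(partial_results)


if __name__ == "__main__":
    values = list(range(1000))
    # The parallel answer matches what a single processor would compute,
    # just arrived at cooperatively.
    print(parallel_simulation(values))
```

Scaling this idea from four local workers to hundreds of thousands of linked processors is, in essence, what separates a desktop from a Roadrunner.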
So, to get even more power to tackle tasks even bigger than the modeling of a tiny virus, an obvious next step is to link together many Roadrunners and get them to work cooperatively on tasks as well as to make the individual processors faster and smaller. That is ultimately where cloud computing will take us. It might take a while before a supercomputer or cloud computer will be ready for the task of simulating a human down to the molecule, but it will be only a while.
Amazon.com started the cloud computing business around the time Roadrunner was built. Today, we take for granted the ease with which we can find and order just about anything under the sun from Amazon, in split seconds, and we (billions of us, around the globe) take for granted Amazon's ability to recognize instantly who we are and to know all about our individual tastes in books, music, films, clothes, gadgets … . It took what seemed, way back in 2006, like massive power to do that, but that kind of power is available to your hospital today, off the shelf, starting at $100,000. You might even be able to order it from Amazon for express delivery tomorrow, in the form of a cloud computer called Nebula One.
Nebula One is billed as the world's first off-the-shelf cloud computer. It enables an organization to operate its own private cloud infrastructure and thereby control vast (by today's standards, that is) computing, storage and networking resources through an administrative dashboard, while empowering users to provision resources through a Web interface — all by just plugging in Nebula One and turning it on. (Yes, I am paraphrasing Nebula's sales pitch but, no, I do not own stock in the company!)
The Time Has Come
A Los Alamos official told the Associated Press that "where we're going to be in the next 10 to 15 years" is "just mindboggling." And he should know — it's the business of Los Alamos National Lab to plan to boggle the mind. Hospital executives need to heed his message, because it applies to their minds, too, and that means it will apply to their budgets, their information technology infrastructure, their planning for staff and equipment and supplies and … .
There is no aspect of hospital operations and business that will not be boggled in the next five to 10 years by the accelerating growth of data and the power to analyze them. It is time to start. The tools are here, and the Health Forum publications alone have plenty of good advice to get you started. In addition to Scott DiDonato's article, John Glaser has written some highly insightful and practical columns in H&HN Daily on big data and advanced analytics.
It's time to take advantage of the opportunity to improve your service. It's time to take the plunge.
David Ellis is a futurist, author, consultant and publisher of Health Futures Digest, a monthly online discursive digest of news and commentary on long-range, leading-edge technological innovations and their consequences and implications for health care policy and practice. He is also a regular contributor to H&HN Daily and a member of Speakers Express.