In October 2010, IBM announced a research collaboration with the Columbia University Medical Center that could help doctors spot life-threatening complications in brain-injury patients up to 48 hours earlier than current methods allow. In a condition called delayed ischemia, a common complication in patients recovering from strokes and brain injuries, blood flow to the brain is restricted, often causing permanent damage or death. With today's diagnostic methods, the damage has often already begun by the time medical professionals see the data and spot symptoms.
IBM’s collaboration with Columbia University can deliver critical insights much faster, using “streaming analytics,” an IBM method of analyzing data “on the fly.” Medical professionals can use IBM’s stream computing platform, IBM® InfoSphere® Streams, to analyze data from more than 200 variables, including lab tests and diagnostic readings, and spot patterns that lead to earlier diagnoses, so doctors can more quickly treat complications like delayed ischemia.

In 2011, the world is ten times more instrumented than it was in 2006. The number of Internet-connected devices has leapt from 500 million to 1 trillion. We create 15 petabytes of new data every day. This rapidly increasing instrumentation of our society means we have access to tremendous streams of data; our methods for analyzing and using that data, however, have not kept pace.

In traditional data analysis, we ask questions of a static set of data. In stream computing, data can be pulled in from multiple continuous streams and analyzed in real time, so it can be used in critical situations where an answer to a complex question is needed quickly, such as using biomedical data to monitor the condition of critically ill premature babies to detect life-threatening infections.
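The contrast between asking questions of a static data set and analyzing a continuous stream can be sketched in a few lines. The following is an illustrative toy only, not IBM InfoSphere Streams or its Streams Processing Language: it processes each reading the moment it arrives, keeping just a small rolling window and a running summary instead of loading a static data set. The stream values, window size and alert threshold are all invented for the example.

```python
from collections import deque

def detect_anomalies(stream, window_size=5, threshold=0.15):
    """Yield an alert whenever the mean of the last `window_size` readings
    drops more than `threshold` below the running mean of the whole stream.
    Each reading is handled as it arrives; nothing is stored except the
    window and two running totals."""
    window = deque(maxlen=window_size)
    total, count = 0.0, 0
    for reading in stream:
        total += reading
        count += 1
        window.append(reading)
        running_mean = total / count
        window_mean = sum(window) / len(window)
        if len(window) == window_size and window_mean < running_mean * (1 - threshold):
            # Emit the position and the two means, rather than waiting for
            # a batch job to scan the data after the fact.
            yield (count, window_mean, running_mean)

# Simulated sensor feed: stable readings, then a sustained drop.
readings = [1.0] * 20 + [0.7] * 5
alerts = list(detect_anomalies(readings))
```

Running this flags the drop at the 24th and 25th readings, while the data is still arriving; a batch analysis of the same values would only surface the pattern after the fact.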
It’s not only about saving lives, but also about giving these babies the best, healthiest start we can. This is a very good example of where—rather than the one-to-one relationship that doctors have—we have the potential to help thousands.
Dr. Carolyn McGregor
CANADA RESEARCH CHAIR IN HEALTH INFORMATICS, UNIVERSITY OF ONTARIO INSTITUTE OF TECHNOLOGY
“A new data-capture project is helping Sick Kids care for its most vulnerable patients,”
Backbone Magazine, October 1, 2009
“It’s the difference between taking a snapshot of a two-year-old … and a live video feed,” says Nagui Halim, IBM Fellow and chief scientist of IBM’s stream computing initiative. “The snapshot is frozen in time. But, with the live video feed, you can capture every nuance, follow the child out of the room, shift the focus, sharpen the image, record it for future use and edit out the uninteresting bits.”

IBM researchers spent half a decade transforming the vision of stream computing into a product; a new programming language, IBM Streams Processing Language, was built just for streaming systems. Stream computing makes advances possible in any industry wrestling with the flood of data created every day: healthcare, telecommunications, utilities, municipal transit, national security and more. Imagine monitoring traffic in a city by combining road network data and global positioning system (GPS) vehicle locations to reconstruct traffic patterns in real time on an interactive map. Or analyzing semiconductor manufacturing lines for errors in microchip production, potentially spotting defects in minutes rather than days.

In one application, a client used the InfoSphere Streams platform to process five million messages of market data per second, with each message taking only a few tens of microseconds to process, helping to speed financial trading decisions. That is the equivalent of streaming the text of the collected works of Shakespeare ten times in less than one second.

As our world becomes more complex, we need tools to understand it. Stream computing systems provide a window into today’s data, where before we had access only to yesterday’s.
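The market-data figures quoted above imply heavy parallelism, which a back-of-envelope calculation makes concrete. The 20-microsecond value below is an assumed example within the stated "few tens of microseconds"; only the five-million-messages-per-second throughput comes from the source.

```python
# Back-of-envelope check on the market-data example: 5 million messages
# per second, with each message taking a few tens of microseconds to
# process. 20 microseconds is an assumed illustrative value in that range.

throughput = 5_000_000        # messages per second (from the source)
latency_s = 20e-6             # assumed per-message processing time: 20 us

# A single serial pipeline at this latency could handle only:
serial_capacity = 1 / latency_s        # about 50,000 messages per second

# So sustaining 5 million messages/second implies roughly this many
# pipelines working in parallel:
pipelines_needed = throughput * latency_s   # about 100
```

In other words, at a 20-microsecond per-message cost, one serial pipeline tops out near 50,000 messages per second, so the quoted throughput implies on the order of a hundred parallel processing pipelines, which is exactly the kind of distributed operator graph stream computing platforms are built to manage.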