In this article

  • Big data and analytics are changing the face of healthcare, smart cities, transportation and retail
  • Memory-Driven Computing can solve problems thousands of times faster than any current computer
From healthcare and transportation to city services and retail, Memory-Driven Computing will unleash new opportunities for problem solving and innovation

We have been in the age of big data for years now, but getting better at collecting it hasn’t made us that much better at managing and putting it to good use.

For years, we have relied on our ability to throw more and more general-purpose microprocessors at a problem to keep up with our growing data processing needs. But as our data sets continue to multiply and become more complex, splitting problems among these cookie-cutter processors, each with only a small amount of its own memory, is fast becoming a dated and less efficient approach.

To unlock the insights and intelligence that live within our ever-growing volumes of data, we’re going to need an entirely new computing architecture. We call it Memory-Driven Computing.

The team behind HPE’s Machine research project is designing the first Memory-Driven Computing architecture, one built around memory rather than processors. In this design, memory is central to the system, not just something tethered to a processor.

Such a system could hold hundreds of terabytes or even petabytes in memory at one time and could solve problems thousands of times faster than a conventional computer.

So what does that mean for industries and sectors as disparate as healthcare, transportation, city services and retail? It means you will be able to get insights and intelligence from your data exponentially faster and more efficiently, unleashing entirely new opportunities for problem solving and innovation.

Healthcare

Healthcare data is set to become one of the biggest computational challenges we will face over the next decade.

The current “prescribe what works for most people” approach to healthcare has been accepted as the standard for too long. The problem is that if you’re not like most people, finding a solution takes time: doctors make educated guesses about diagnoses and try out various treatments until something works. But the precision-medicine revolution is within reach, and Memory-Driven Computing could make that vision a reality.

To devise personalized diagnoses and treatments, doctors and researchers need access to a patient’s entire medical history, as well as family health history, lifestyle data and environmental exposures. They also need genomic data and data on the patient’s microbiome (all of the microorganisms that live in the human body). Then there are the specific dimensions and peculiarities of internal organs, bones and brain chemistry. Analyzing all of that in the context of billions of other medical profiles would identify the diagnoses, treatments and therapies that worked best for people most similar to the patient, rather than those that merely work best for the average patient.
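
To make that matching step concrete, here is a minimal sketch of the kind of similarity query such a system would run with the entire profile set held in memory. Everything in it is a hypothetical placeholder: the feature encoding, the synthetic data and the treatment labels.

```python
# A minimal, illustrative similarity query: find the k patients most like a
# new patient, then vote over the treatments that worked for that cohort.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoding: each row is one patient profile, with normalized
# features derived from medical history, genomics, lifestyle and environment.
profiles = rng.random((100_000, 64))         # patients x features, in memory
outcomes = rng.integers(0, 5, size=100_000)  # treatment that worked best (0-4)

def best_treatment_for(patient, k=100):
    """Vote over the treatments of the k most similar patient profiles."""
    dists = np.linalg.norm(profiles - patient, axis=1)  # distance to everyone
    cohort = np.argpartition(dists, k)[:k]              # k nearest profiles
    return np.bincount(outcomes[cohort]).argmax()       # most common success

print(best_treatment_for(rng.random(64)))
```

A Memory-Driven system would aim to run a scan like this over billions of real profiles, not the small synthetic sample shown here, without the data ever leaving memory.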

Such a catalog of variables represents a dizzying amount of data, even without considering the connected healthcare devices in hospitals and wearable health technologies that are creating minutely personal digital blueprints for each of us.

Memory-Driven Computing systems will be essential to synthesizing all of this information effectively. David Bader, professor and executive director of high-performance computing in the Georgia Tech College of Computing, says that analyzing data deeply enough to produce useful clinical insights about the origins of a particular disease “could take me hundreds of years with a current computer system. But with a Memory-Driven Computing system, I may be able to take those algorithms that I want to compute, run them and get an answer in five minutes.”

Such modeling and simulation will allow doctors to address some of the most intractable scourges of human health: cancer, heart attacks, chronic diseases and HIV. It could also help to define and cure diseases that are exceedingly rare and not yet susceptible to diagnosis. And with more robust predictive analytics, Memory-Driven Computing could actually head off major illnesses before they occur. Big data is already helping doctors treat disease, and the future of computing could lead to the prevention of disease altogether.

Transportation

Americans spend 4.2 billion hours each year sitting in traffic, which saps $87.2 billion from the economy. Likewise, about 20 percent of all commercial airline flights are delayed, which usually has a domino effect throughout the air transport network, wreaking general havoc with reservations, baggage handling, gate changes, crew assignments and security.

The smart airports and transportation systems of the future will be equipped to solve, if not eliminate, these complex problems. But to do that they will need to be faster and more connected. “A lot of what we want to implement for traffic control is short term. You’ve got to be able to do that fast,” says Mike Hunter, a transportation operations and design specialist and associate professor at the School of Civil and Environmental Engineering at Georgia Institute of Technology. Memory-Driven Computing will allow transportation systems to predict problems “faster and with bigger datasets,” says Hunter.

For instance, an airport could simulate an almost infinite number of potential delays at different places around the terminal and hold the solution in memory so that it would know exactly how to respond the instant one of those delays occurred.
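
A sketch of that “simulate ahead, answer instantly” pattern appears below: an in-memory playbook keyed by scenario. The scenario model and responses are invented for illustration.

```python
# Enumerate disruption scenarios offline, keep the best response for each in
# an in-memory table, and look it up the instant a real delay occurs.
import itertools

GATES = [f"G{i}" for i in range(1, 6)]
CAUSES = ["weather", "crew", "mechanical"]
SEVERITIES = ["minor", "major"]

def simulate_best_response(gate, cause, severity):
    """Stand-in for an expensive simulation run ahead of time."""
    return f"reroute flights from {gate}" if severity == "major" else f"hold at {gate}"

# Precompute every combination. With memory to spare, the table could grow to
# millions of scenarios without changing the constant-time lookup below.
playbook = {
    scenario: simulate_best_response(*scenario)
    for scenario in itertools.product(GATES, CAUSES, SEVERITIES)
}

# Responding to a live delay is now a single dictionary lookup.
print(playbook[("G3", "weather", "major")])  # -> reroute flights from G3
```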

Similarly, in the event of a major infrastructure failure (say, the collapse of a critical bridge, as recently happened in Atlanta), a city could already know how to optimize traffic flow, coordinating traffic signals and routing drivers on mobile maps through real-time analysis of complex data sets held in memory to safely and efficiently avoid jams. This is simply impossible on a traditional computer, where modeling the entire city of Atlanta, accounting for all the travelers in the region and devising an appropriate workaround could take weeks.
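
As a toy illustration of the rerouting step, the sketch below holds a tiny, invented road network in memory as a weighted graph, drops the failed link and recomputes the fastest route immediately; a city-scale version would do the same over an enormously larger graph.

```python
# Hold the road network in memory, remove the failed bridge, reroute at once.
import heapq

graph = {  # node -> {neighbor: travel time in minutes}; illustrative only
    "A": {"B": 4, "C": 2},
    "B": {"A": 4, "D": 5},
    "C": {"A": 2, "D": 8},
    "D": {"B": 5, "C": 8},
}

def shortest_time(src, dst):
    """Dijkstra's algorithm over the in-memory graph."""
    heap, seen = [(0, src)], set()
    while heap:
        t, node = heapq.heappop(heap)
        if node == dst:
            return t
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph[node].items():
            heapq.heappush(heap, (t + w, nbr))
    return float("inf")

print(shortest_time("A", "D"))        # normal route via B: 9 minutes
del graph["B"]["D"], graph["D"]["B"]  # the B-D bridge collapses
print(shortest_time("A", "D"))        # rerouted via C: 10 minutes
```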

Smart Cities

What if pedestrian fatalities in urban areas could be eliminated entirely? It’s a goal city planners call Vision Zero, first implemented in Sweden in the 1990s and now spreading across the United States. One approach would use smart-city sensors to find the intersections with the most near misses, and thus the highest risk of pedestrian fatalities. Those intersections could then be redesigned and made safer.

“Usually, doing an investigation after the fact is too late,” says Pete Beckman, co-director of the Northwestern-Argonne Institute of Science and Engineering in Chicago, who is helping to head up the city’s Array of Things smart cities project. For the predictive approach to work, says Beckman, you would need to detect these patterns right where and when they were happening. The algorithm would have to be plugged into the camera itself, at the so-called “edge,” rather than processing the data remotely or in the cloud.

“This notion of processing and writing code for a smart city in the device, whether that’s the traffic light, or the car, or the trash can, or the LCD advertising display, or the bike—this is a pretty new idea,” says Beckman. “Memory-Driven Computing is a technology that can help enable this kind of edge computing.”
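
The pattern Beckman describes can be sketched in a few lines: run the detector on the device and transmit only events, never raw video. The detector below is a placeholder; a real deployment would run a trained model on the camera itself.

```python
# Edge processing at an intersection camera: analyze frames on the device,
# send only detected near-miss events upstream. Placeholder detector logic.
from dataclasses import dataclass

@dataclass
class NearMiss:
    timestamp: float
    distance_m: float  # closest approach between pedestrian and vehicle

def detect_near_miss(frame, threshold_m=1.0):
    """Flag frames where a pedestrian and vehicle come dangerously close."""
    if frame["min_gap_m"] < threshold_m:
        return NearMiss(frame["t"], frame["min_gap_m"])
    return None

def run_on_camera(frames):
    # Runs on the camera itself ("the edge"); only events leave the device.
    return [e for e in (detect_near_miss(f) for f in frames) if e]

frames = [{"t": 0.0, "min_gap_m": 3.2}, {"t": 0.5, "min_gap_m": 0.6}]
print(run_on_camera(frames))  # -> [NearMiss(timestamp=0.5, distance_m=0.6)]
```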

Vision Zero is just one example of how Memory-Driven Computing could be applied in a smart city. By 2018, Chicago aims to place 500 sensor nodes in its streets and parks, including air quality monitors, microphones and cameras. Residents’ mobile phones, connected cars and self-tracking devices will offer an additional layer of data. Working with this many variables at once could yield all kinds of urban safety and efficiency gains, but that depends on being able to access, integrate and act on all of that data instantly.

Retail

Imagine a future in which your favorite retailers anticipate your shopping needs, tastes and budget more accurately than your best friend can. When you’re near your favorite store, for example, your smart phone might notify you about a special deal on travel accessories that are perfect for that trip to London you recently booked online.

Welcome to the age of personalized retail. The more retailers know about their customers’ shopping history, future plans and point-of-sale activity, the faster and better they can bring them the products they need. There’s no shortage of raw information: Retailers are set to invest billions by 2020 in beacons, sensors and radio-frequency identification (RFID) tags. But all this data is useless unless companies can identify trends and customer needs before the moment passes, without trespassing on customers’ privacy or making them feel spied upon.

The architecture of current computing systems makes them incapable of crunching enormous data sets without long lag times. But systems built on HPE’s Memory-Driven Computing architecture could deliver enormous gains in analytic capacity and provide actionable insights within the attention span of a shopper. Imagine comparing a customer’s entire purchase history with the shopping trends of the 100 customers most similar to them, all while the customer is still in the aisle.
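
As a rough sketch of that in-aisle query, the example below compares one shopper against everyone else in a purchase matrix held in memory, finds the 100 most similar shoppers and suggests items that cohort buys but this shopper hasn’t. The data and the similarity measure are invented for illustration.

```python
# In-memory "shoppers like you" query over a synthetic purchase matrix.
import numpy as np

rng = np.random.default_rng(1)
purchases = rng.integers(0, 2, size=(100_000, 200))  # shoppers x products, 0/1

def suggest(shopper, k=100, n_items=3):
    """Recommend items popular with the k shoppers most like this one."""
    history = purchases[shopper]
    scores = purchases @ history               # purchase overlap with everyone
    scores[shopper] = -1                       # exclude the shopper themselves
    cohort = np.argpartition(scores, -k)[-k:]  # the k most similar shoppers
    popularity = purchases[cohort].sum(axis=0) # what that cohort buys
    popularity[history == 1] = -1              # skip items already purchased
    return np.argsort(popularity)[-n_items:]   # top products to suggest

print(suggest(shopper=42))  # product ids to surface while still in the aisle
```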

“I think the ability to step in and interactively engage with customers while they’re there is the frontier of where retailers want to go,” says James Connor, director of architecture for data science and analytics at 84.51°, the in-house analytics arm of supermarket chain Kroger. That requires machine learning and predictive analytics, which he describes as currently “depressingly slow.” Memory-Driven Computing, he says, could help solve that.