Here at HPE, we are constantly introducing innovative products and services and then continuously improving them to ensure that our technologies keep up with our customers' changing needs. But what happens when we realize that simply scaling existing technologies won't be enough to keep pace with progress?
We're seeing our customers, and the industry as a whole, dealing with a massive onslaught of data that is growing exponentially in both volume and complexity. We're struggling to keep pace today. By the end of this decade, data will arrive faster than our current infrastructure can evolve to ingest, store and analyze it. A step change in computing technology is required.
Wouldn’t it be great if you had seemingly unlimited compute power?
- You could inspect and classify every bit of data entering and leaving your enterprise
- You could analyze a trillion customer relationship management records in the blink of an eye
- You could expand the bandwidth and storage of all your data centers tenfold while slashing your energy consumption
- Your doctor could compare your symptoms and genomics with every other patient around the world to improve your health outcomes, instantly, without language barriers or privacy breaches
This is why we are building The Machine.
Why do we call it The Machine? When we first started developing it, we wanted to be very careful not to call it a server, workstation, PC, device or phone, because it actually encompasses all of those things. So while we waited for Marketing to come up with a cool code name for the project, we started calling it The Machine, and the name stuck.
Turning research into reality
Behind closed doors, Hewlett Packard Labs has been developing the fundamental components of a solution that will have the horsepower to take in all of the future's information and process it in new ways. Developing The Machine means not only building the hardware, but also investing in the software that will support it: the data algorithms, the operating systems, the security platform and the tools required to manage millions of compute nodes, from servers and data centers to the smart sensors that will make up the internet of things.
Here are just a few of the challenges The Machine is being built to address:
- Data governance and security issues are top of mind—not only for organizations, but also for the average individual. Recent events have made it clear that organizations can’t afford to take a passive or reactive approach to the way they protect their information. The Machine will make possible the secure storage, aggregation and transmission of never-before-imagined amounts of data.
- Most organizations can use big data analysis only in hindsight: existing tools process events that have already happened. The Machine is designed to enable truly real-time insight, helping organizations move from purely reactive analysis to the foresight to predict what is going to happen.
- Simply continuing to expand the current data center model isn't feasible. Today, big data means bringing all the data into one place. Tomorrow, some of that data will be too big and too expensive to move. Tomorrow's analytics will work where the data is created, transforming it locally into intelligence that is then sent to a centralized learning engine powered by The Machine. The Machine will not only increase performance; it will also greatly reduce the energy needed to achieve those speeds.
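The edge-to-core flow described above can be sketched in a few lines: each node turns its raw readings into a small summary, and only those summaries travel to a central engine. This is purely an illustrative sketch, not HPE code; the class names (`EdgeNode`, `CentralEngine`) and the simple sum-and-count aggregation scheme are assumptions chosen to show the pattern.

```python
class EdgeNode:
    """Hypothetical edge node: transforms raw local readings into a compact summary."""
    def __init__(self, name):
        self.name = name
        self.readings = []

    def ingest(self, value):
        self.readings.append(value)

    def summarize(self):
        # Only this aggregate leaves the node; the raw readings stay local.
        return {"node": self.name,
                "count": len(self.readings),
                "sum": sum(self.readings)}


class CentralEngine:
    """Hypothetical centralized learning engine: combines per-node summaries."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def absorb(self, summary):
        self.count += summary["count"]
        self.total += summary["sum"]

    def global_mean(self):
        return self.total / self.count if self.count else 0.0


engine = CentralEngine()
for name, data in [("sensor-a", [1.0, 2.0, 3.0]), ("sensor-b", [4.0, 5.0])]:
    node = EdgeNode(name)
    for value in data:
        node.ingest(value)
    # Three numbers cross the network per node, regardless of reading volume.
    engine.absorb(node.summarize())

print(engine.global_mean())  # (6.0 + 9.0) / 5 readings -> 3.0
```

The design choice the bullet argues for is visible in `absorb`: the central engine never sees raw data, only fixed-size summaries, which is why moving analytics to where data is created saves both bandwidth and energy.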
This is a revolution
We believe we are the only company left with the breadth of technology, systems engineering skill, innovation culture and sheer will to put all of these elements together and make them work. With The Machine we have the opportunity to rethink security, data governance, data placement and data sovereignty from the ground up and embed them into all of our products. This revolutionary project is on its way to changing the industry and the way we compute.