In this article
- Artificial intelligence, deep learning and advanced analytics require a set of new, open building blocks that will enable a level of innovation unavailable to a closed, proprietary set of standards.
- In 2016, HPE joined other industry leaders in founding Gen-Z in an effort to develop a new universal interconnect that would enable simpler and more powerful computer architectures.
- Gen-Z will influence the future of our high-performance computing platforms and is an important part of our Memory-Driven Computing architecture.
Gen-Z, the consortium formed to solve the challenges associated with processing and analyzing huge amounts of data in real time, today released its Core Specification 1.0, which will enable microchip designers and fabricators to begin the development of products enabling Gen-Z technology solutions. We sat down with Alain Andreoli, Senior Vice President and General Manager of HPE’s Hybrid IT Group, and Kirk Bresniker, Hewlett Packard Labs Chief Architect and HPE VP/Fellow, to learn more about the announcement.
Q: Can you give us an understanding of what Gen-Z announced today and how it supports the consortium’s mission?
Alain: Gen-Z was created to ensure there is freedom of action across the industry to work more closely together and to fully integrate new innovation. The demand for increased computing performance cannot be met by simply scaling the multiple different interconnects in use today. The emerging set of compute-intensive workloads such as artificial intelligence, deep learning and advanced analytics require a set of new, open building blocks that will enable a level of innovation unavailable to a closed, proprietary set of standards.
You might recall that in October 2016, HPE joined other industry leaders in founding Gen-Z in an effort to develop a new universal interconnect that would enable simpler and more powerful computer architectures. The goal was to develop a new standard interconnect and protocol – one single, open, high-performance interconnect.
The core specification will enable microchip designers and fabricators to begin the development of products enabling Gen-Z technology solutions. Making a custom silicon chip is a significant investment in time and resources. Clearly, an established specification enables companies to invest with confidence and will accelerate Gen-Z to market.
Q: What scale of impact can we expect from implementing one standard interconnect?
Alain: This work is important because it will allow any component – processing, memory, accelerators, networking – to talk to any other component as if it were communicating with its own local memory, using simple commands. It’s what we call a “memory-semantic protocol,” which is just a fancier way of saying the same thing. It allows us to construct the perfect computer for any task, assembling any combination of components for maximum performance and power efficiency. Even better, it simplifies hardware and software designs, reducing solution cost and complexity.
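Alain’s description can be made concrete with a small software analogy. The sketch below is illustrative only: Gen-Z itself is a hardware fabric protocol, not a software API, and the `mmap` region here is merely a stand-in for a fabric-attached memory pool. It shows what “memory-semantic” access looks like in practice: data is reached with ordinary loads and stores rather than through a block I/O driver stack with buffering and copies.

```python
# Illustrative analogy only -- Gen-Z is a hardware interconnect protocol,
# not a software library. An anonymous mmap region stands in for a
# fabric-attached memory pool to show the memory-semantic access model.
import mmap

# A "memory pool" that, on a real fabric, could live on another device.
pool = mmap.mmap(-1, 1 << 20)  # anonymous 1 MiB mapping

# Memory-semantic write: a direct, byte-addressable store -- no read()/
# write() syscalls, no intermediate buffers, no protocol translation.
pool[0:13] = b"hello, fabric"

# Memory-semantic read: a direct, byte-addressable load.
data = pool[0:13]
assert data == b"hello, fabric"

pool.close()
```

On a real memory-semantic fabric, the same load/store model would extend to remote memory pools, accelerators and persistent media, which is why both hardware and software designs become simpler.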
And speaking of scale, I’d like to emphasize that this isn’t just about building big computers.
Q: What does today’s announcement mean for HPE?
Alain: The public release of the Gen-Z Core Specification 1.0 is a significant milestone to enable an open ecosystem of innovation in computing devices of every size, from the IoT edge to data centers to supercomputing. As one of the founding Gen-Z consortium members, Hewlett Packard Enterprise will leverage Gen-Z’s fabric technology to advance HPE’s Memory-Driven Computing agenda to deliver an entirely new way of computing that will power the next wave of high-performance and data analytics applications for our customers.
HPE is committed to delivering industry-leading IT solutions to our customers based on innovations that have great business value. Gen-Z technology will have that impact, and our customers can expect to benefit from these new capabilities in the coming years.
Kirk: One class of application that Gen-Z makes possible is the manipulation of huge data sets residing in large pools of fast, persistent memory providing the foundation of HPE’s Memory-Driven Computing strategy. The prototypes from The Machine research program are an early example of a memory-semantic fabric. Gen-Z will influence the future of our high-performance computing platforms and is an important part of our Memory-Driven Computing architecture. Memory-Driven Computing enables critical leaps in performance and allows the industry to move beyond today’s constrained architectures.
In fact, the PathForward program with the U.S. Department of Energy is enabling the industry to collaborate on the development of supercomputers that harness technologies encouraged by an open and competitive ecosystem. At the core of this strategy is the use of the Gen-Z communications protocol, which offers dramatic improvements in application performance and power efficiency.
Q: Why the focus on memory?
Kirk: This is an important question. Memory-Driven Computing isn’t just about memory. It’s about everything in a computer talking to everything else as if they were talking to memory, which means simple commands, completed in nanoseconds. You can build a system with huge amounts of shared memory, but you can also build something that looks like a better internet server, or a massively parallel supercomputer, or an ultra-efficient machine learning-enabled IoT device. This is HPE’s vision for computing in the big data era.
Q: How is today’s computer architecture falling behind?
Kirk: Across industries, the explosion of data and the demand for real-time analysis of that data are changing the speed of business. But today’s computer architecture is holding us back. The comfortable cadence of automatic improvements described by Moore’s Law is coming to an end. Moreover, we need more flexibility and performance than current architectures, in which every action is funneled through a conventional microprocessor, can deliver. We need new solutions.
People in the industry used to ask, “Can’t we make incremental improvements to keep Moore’s Law going?” The answer is no. Gen-Z is evidence that the industry now sees that the end of Moore’s Law will require adoption of a memory-centric architecture to thrive.
Q: What will the innovations coming out of this ecosystem mean for science?
Kirk: There are many big problems in our world, from disease treatments to disaster prevention, that are simply not solvable with our current computing capabilities because they involve much larger data sets than we can hope to process in any useful time. These problems require the power of a new breed of supercomputers called exascale systems – ten times more powerful than the biggest supercomputers on the planet today.
We believe a memory-centric architecture is the answer to creating these exascale systems and that Gen-Z is helping us get there.
Q: Any last words?
Alain: By enabling technologists to collaborate and contribute to an open and competitive ecosystem, Gen-Z will help the industry fundamentally change how the world thinks about computing.
We look forward to helping make this vision a reality.