
HPE’s Latest Computer is a Memory Mammoth for Big Data Era

Prototype from The Machine research program upends 60 years of innovation and demonstrates the potential for Memory-Driven Computing

HPE has announced the world’s largest single-memory computer, developed as part of Hewlett Packard Enterprise’s largest-ever R&D program, The Machine research project (The Machine).

Reaching its first major milestone, HPE’s latest computer contains 160 terabytes (TB) of memory – capable of simultaneously working with the data held in every book in the Library of Congress five times over, or approximately 160 million books.

HPE claims that it has never been possible to hold and manipulate whole data sets of this size in a single-memory system, and this is just a glimpse of the immense potential of Memory-Driven Computing.

“The secrets to the next great scientific breakthrough, industry-changing innovation, or life-altering technology hide in plain sight behind the mountains of data we create every day,” said Meg Whitman, CEO of Hewlett Packard Enterprise. “To realize this promise, we can’t rely on the technologies of the past, we need a computer built for the Big Data era.”

Scalability & Societal Implications

Based on the current prototype, HPE expects the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory – 4,096 yottabytes. For context, that is 250,000 times the entire digital universe today.
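As a rough sanity check on those figures, the claims can be reproduced with a few lines of arithmetic (a sketch only, assuming decimal SI units, i.e. 1 yottabyte = 10^24 bytes):

```python
# Back-of-the-envelope check of HPE's scaling figures (SI decimal units assumed).
TB = 10**12  # terabyte in bytes
ZB = 10**21  # zettabyte in bytes
YB = 10**24  # yottabyte in bytes

prototype = 160 * TB   # the current prototype's shared memory pool
ceiling = 4096 * YB    # the claimed architectural limit

# Dividing the claimed limit by 250,000 recovers the implied size of the
# "entire digital universe" at the time of the announcement.
digital_universe = ceiling / 250_000
print(f"Implied digital universe: {digital_universe / ZB:.1f} ZB")   # 16.4 ZB
print(f"Scale-up factor from prototype: {ceiling / prototype:.2e}")  # 2.56e+13
```

The implied ~16 ZB figure matches industry estimates of global data volume around 2017, so the article’s numbers are internally consistent.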

Technical Specifications

The new prototype builds on the achievements of The Machine research program, including:

  • 160 TB of shared memory spread across 40 physical nodes, interconnected using a high-performance fabric protocol
  • An optimized Linux-based operating system (OS) running on ThunderX2, Cavium’s flagship second-generation, dual-socket-capable, ARMv8-A workload-optimized system on a chip (SoC)
  • Photonics/optical communication links, including the new X1 photonics module, online and operational
  • Software programming tools designed to take advantage of abundant persistent memory
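From the figures in the list above, a quick calculation (hypothetical variable names; decimal units assumed) gives each node’s share of the fabric-attached memory pool:

```python
# Per-node share of the prototype's 160 TB shared memory pool.
TOTAL_MEMORY_TB = 160
NODES = 40

per_node_tb = TOTAL_MEMORY_TB / NODES
print(f"Each of the {NODES} nodes contributes {per_node_tb:.0f} TB "
      f"to the shared pool")  # 4 TB per node
```

Because the fabric presents all 160 TB as a single address space, software sees one pool rather than forty 4 TB islands – the key distinction of Memory-Driven Computing.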

With that amount of memory, it will be possible to simultaneously work with every digital health record of every person on earth; every piece of data from Facebook; every trip of Google’s autonomous vehicles; and every data set from space exploration all at the same time – getting to answers and uncovering new opportunities at unprecedented speeds.

“We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” said Mark Potter, CTO at HPE and Director, Hewlett Packard Labs. “The architecture we have unveiled can be applied to every computing category – from intelligent edge devices to supercomputers.”

The Machine, the largest R&D program in the company’s history, is aimed at delivering a new paradigm called Memory-Driven Computing – an architecture custom-built for the Big Data era.

To learn more about Memory-Driven Computing and The Machine research program, please visit: www.hpe.com/TheMachine.


Niloy Banerjee

A generic movie buff, passionate and professional about print journalism, serving editorial verticals on technical and B2B segments; a crude rover and writer on business happenings; in spare time, a player of physical and digital games. His love of philosophy is perennial, like trying to collect pebbles from the ocean of literature. Lastly, a connoisseur of making and eating palatable cuisines.
