
World’s First Memory-Driven Computing Architecture from HPE Set to Transform Computing


Hewlett Packard Enterprise has announced that it is the first to demonstrate Memory-Driven Computing, a concept that puts memory, not processing, at the center of the computing platform to realize performance and efficiency gains not possible today.

Developed under The Machine research program, the IT giant’s new proof-of-concept prototype is claimed to be a major milestone in the company’s efforts to transform the fundamental architecture on which all computers have been built for the past 60 years.

Commenting on this ground-breaking development, Som Satsangi, VP & GM, Enterprise Group, Hewlett Packard Enterprise India, said, “We have achieved a major milestone with The Machine research project – one of the largest and most complex research projects in our company’s history. With this prototype, we have demonstrated the potential of Memory-Driven Computing and also opened the door to immediate innovation. Our customers and the industry as a whole can expect to benefit from these advancements as we continue our pursuit of game-changing technologies.”

With the prototype, first brought online in October, HPE has demonstrated the fundamental building blocks of the new architecture working together:

  • Compute nodes accessing a shared pool of Fabric-Attached Memory;
  • An optimized Linux-based operating system (OS) running on a customized System on a Chip (SOC);
  • Photonics/optical communication links, including the new X1 photonics module, online and operational; and
  • New software programming tools designed to take advantage of abundant persistent memory.

The company has also run these new software programming tools on existing products, illustrating improved execution speeds of up to 8,000 times on a variety of workloads. HPE expects to achieve similar results as it expands the capacity of the prototype with more nodes and memory.

In addition to bringing added capacity online, The Machine research project will increase its focus on exascale computing. Exascale is a developing area of High Performance Computing (HPC) that aims to create computers several orders of magnitude more powerful than any system online today. HPE’s Memory-Driven Computing architecture is incredibly scalable, from tiny IoT devices to the exascale, making it an ideal foundation for a wide range of emerging high-performance compute and data-intensive workloads, including big data analytics.

Further, the company asserts its commitment to rapidly commercialize the technologies developed under The Machine research project into new and existing products. These technologies currently fall into four categories: Non-volatile memory, fabric (including photonics), ecosystem enablement and security.

Non-Volatile Memory (NVM)

HPE continues its work to bring true, byte-addressable NVM to market and plans to introduce it as soon as 2018/2019. Using technologies from The Machine project, the company developed HPE Persistent Memory – a step on the path to byte-addressable non-volatile memory, which aims to approach the performance of DRAM while offering the capacity and persistence of traditional storage. The company launched HPE Persistent Memory in the HPE ProLiant DL360 and DL380 Gen9 servers.

Fabric (including Photonics)

Under photonics research, HPE has taken steps to future-proof products, such as enabling HPE Synergy systems that will be available next year to accept future photonics/optics technologies currently in advanced development. Looking beyond, HPE plans to integrate photonics into additional product lines, including its storage portfolio, as soon as 2018/2019. The company also plans to bring to market fabric-attached memory, leveraging the high-performance interconnect protocol being developed under the recently announced Gen-Z Consortium, which HPE recently joined.

Ecosystem Enablement

Much work has already been completed to build software for future memory-driven systems. HPE launched its Hortonworks/Spark collaboration this year to bring software built for Memory-Driven Computing to market. In June 2016, the company also began releasing code packages on GitHub to begin familiarizing developers with programming on the new memory-driven architecture. The company plans to put this code into existing systems within the next year and will introduce next-generation analytics and applications in new systems as soon as 2018/2019. As part of the Gen-Z Consortium, HPE plans to start integrating ecosystem technology and specifications from this industry collaboration into a range of products during the next few years.

Security

With this prototype, HPE demonstrated new, secure memory interconnects in line with its vision to embed security throughout the entire hardware and software stack. HPE plans to further this work with new hardware security features in the next year, followed by new software security features over the next three years. Beginning in 2020, the company plans to bring these solutions together with additional security technologies currently in the research phase.



BiS Team

