With Functional Proof of Concept for Compute Express Link™ (CXL™), Companies Share a Vision of Performance- and Efficiency-Centric Systems Built for the Dynamic Memory Era
LAS VEGAS, May 03, 2022–(BUSINESS WIRE)–Liqid, the global leader in software-delivered data center composability, today announced that the company is collaborating with Samsung, the world’s largest memory technology provider, and Tanzanite Silicon Solutions, a leader in memory pooling technology, to demonstrate composable memory via Compute Express Link™ (CXL™) 2.0 at Dell Technologies World 2022 (Booth #242). By providing high-speed CPU-to-memory connections for the first time, CXL decouples DRAM from the CPU, the last piece of hardware to be disaggregated. With native CXL support, Liqid Matrix™ Composable Disaggregated Infrastructure (CDI) software can now cluster and compose memory in tandem with GPUs, NVMe storage, persistent memory, FPGAs, and other accelerator devices. By making DRAM a composable resource on CXL fabrics, Liqid, Samsung, and Tanzanite are demonstrating the efficiency and flexibility needed to meet changing infrastructure demands driven by rapid advances in artificial intelligence and machine learning (AI+ML), edge computing, and hybrid cloud environments.
“With the groundbreaking performance CXL delivers, the industry will be better positioned to support and make sense of the massive wave of AI innovation expected over the next few years, and we’re excited to partner with Samsung and Tanzanite to exemplify the power of this new protocol,” said Ben Bolles, Executive Director, Product Management, Liqid. “As our demonstration illustrates, by decoupling DRAM from the CPU, CXL allows us to achieve breakthrough performance, more sustainable infrastructure flexibility, and greater resource efficiency, preparing organizations to meet the architectural challenges industries face as AI evolves at the speed of data.”
According to Gartner, the use of AI in the enterprise has tripled in the past two years, and by 2025, as the AI market matures, AI will be the primary driver of infrastructure decision-making, leading to a 10-fold increase in computing needs over the same period.1
Adopting CXL technology will significantly help meet this growing demand for compute performance, efficiency, and sustainability not possible with traditional architectures. It allows previously static DRAM resources to be shared for dramatically higher performance, reduced software stack complexity, and lower overall system cost, letting users focus on accelerating time to results for target workloads rather than maintaining physical hardware.
With a wave of CXL-supported servers becoming commercially available, composability built into any server refresh lets organizations continue to use existing resources while deploying DRAM as a shared bare-metal resource, used in tandem with the acceleration technologies already supported by Liqid Matrix™ CDI software. Recognizing the urgent need for compute performance and efficiency, Liqid Matrix software leads the industry in supporting CXL as a fabric type.
Samsung is the global leader in advanced memory technologies and has been working with data center, server, and chipset manufacturers to develop CXL interface technology since the CXL Consortium was formed in 2019. Samsung’s recently unveiled DDR5-based CXL module is the industry’s first memory expansion module supporting the interface. CXL memory expansion technology scales memory capacity and bandwidth well beyond what is commercially available today, enabling organizations to meet the much larger and more complex workload demands associated with AI and other evolving data center applications.
“With the new DDR5-based CXL memory modules, Samsung is helping to lay the foundation for a high-bandwidth, low-latency memory ecosystem designed to support and advance the modern computing age, in which AI+ML is integrated into more daily data center operations,” said Cheolmin Park, Vice President of Global Memory Sales and Marketing at Samsung Electronics. “We are building the CXL ecosystem with Liqid and others to unlock the unprecedented infrastructure performance needed to deliver breakthroughs in high performance computing for our customers and the industry as a whole.”
Tanzanite’s architecture and purpose-built “Smart Logic Interface Connector” (SLICTZ) enable independent scaling and sharing of memory and compute in a low-latency pool within and across server racks. The Tanzanite solution provides a highly scalable architecture for exascale memory capacity and compute acceleration.
For Dell Technologies World, the Liqid team collaborated with engineers from Samsung and Tanzanite Silicon Solutions to demonstrate the potential of composable DRAM in real-world scenarios. The lab setup consists of two Archer City systems based on a next-generation Intel Xeon Scalable processor (codenamed Sapphire Rapids), plus Tanzanite’s SLICTZ SoC implemented in an Intel Agilex FPGA, demonstrating clustered/tiered memory allocated across two hosts and orchestrated using Liqid Matrix CDI software.
“We are excited to work with Liqid and Samsung to demonstrate our shared vision of the potential of the CXL protocol with Tanzanite’s industry-leading memory pooling solution, applicable to a wide range of emerging applications such as AI+ML, blockchain technology, and the metaverse,” said Shalesh Thusoo, CEO, CTO and Founder of Tanzanite Silicon Solutions. “As the demo at Dell Technologies World shows, composability for DRAM via CXL has sweeping implications for how we design our hardware ecosystems for better efficiency, performance, and flexibility in a world operating at data center scale.”
Liqid is a member of the Compute Express Link™ Consortium. Visit the Liqid website to schedule a free infrastructure assessment with an authorized Liqid representative. Follow Liqid on Twitter and LinkedIn to stay up to date with the latest Liqid news and industry insights.
Liqid’s composable infrastructure software platform, Liqid Matrix™, provides cloud-like speed and flexibility, as well as increased data center infrastructure efficiency. Now IT can configure, deploy, and scale physical bare-metal servers in seconds, then reallocate valuable acceleration and storage resources through software as needs change. Dynamically provision previously impossible systems or scale existing investments, then redeploy resources to where they’re needed in real time. Unleash cloud-like data center agility at any scale and discover new levels of resource and operational efficiency with Liqid.
1. Source: Gartner, Predicts 2021: Operational AI and Enabling AI Orchestration Platforms, December 2020
See the source version on businesswire.com: https://www.businesswire.com/news/home/20220503005324/en/
Senior Director, Communications
917 224 7769