ASRock Rack debuts NVIDIA Blackwell-powered servers with GenAI in mind

Wed, 5th Jun 2024

ASRock Rack has unveiled a range of new servers, several of which integrate the NVIDIA Blackwell architecture and are aimed at accelerating large language model (LLM) inference and generative AI workloads.

Taking centre stage is the new ORV3 NVIDIA GB200 NVL72, a liquid-cooled rack-scale server. The system connects 36 NVIDIA Grace CPUs and 72 NVIDIA Blackwell Tensor Core GPUs with NVIDIA NVLink, allowing the rack to operate as a single massive GPU designed to accelerate real-time LLM inference while reducing total cost of ownership (TCO).

The company also introduced two additional servers: the 6U8X-EGS2 NVIDIA HGX B100 and the 6U8X-GNR2/DLC NVIDIA HGX B200. Each is built around an 8-GPU NVIDIA HGX baseboard (B100 and B200 respectively), with the latter featuring direct-to-chip liquid cooling to push the thermal design power (TDP) limits of the NVIDIA Blackwell GPUs within a 6U rackmount design. ASRock Rack will also soon launch an air-cooled alternative, the 8U8X-GNR2 NVIDIA HGX B200, aimed at customers without liquid-cooling infrastructure. This version likewise incorporates the NVIDIA HGX B200 8-GPU system for enhanced real-time inference on trillion-parameter models.

All announced ASRock Rack NVIDIA HGX servers will support up to eight NVIDIA BlueField-3 SuperNICs, allowing them to leverage NVIDIA Spectrum-X, an AI-optimised Ethernet platform. In addition, these systems will be certified for the full-stack NVIDIA AI and accelerated computing platform, which includes NVIDIA NIM inference microservices, part of the NVIDIA AI Enterprise software platform for generative AI.
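
To illustrate what NVIDIA NIM inference microservices mean in practice for these systems, the following is a minimal sketch of querying a NIM container deployed on such a server, assuming it exposes the standard OpenAI-compatible chat completions endpoint; the host, port, and model name are illustrative placeholders rather than details from this announcement.

```python
# Minimal sketch: querying a hypothetical NVIDIA NIM inference microservice
# through its OpenAI-compatible HTTP API. The endpoint address and model
# name below are illustrative assumptions, not values from the article.
import requests

NIM_ENDPOINT = "http://localhost:8000/v1/chat/completions"  # assumed local deployment

payload = {
    "model": "meta/llama3-8b-instruct",  # example model; depends on the deployed NIM
    "messages": [
        {"role": "user", "content": "Summarise the benefits of liquid-cooled GPU servers."}
    ],
    "max_tokens": 256,
}

# Send the request and print the generated reply
response = requests.post(NIM_ENDPOINT, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

In a real deployment, the endpoint address and model identifier would depend on which NIM container is running and how it is exposed on the network.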

Weishi Sa, President of ASRock Rack, stated, "We are showcasing data centre solutions powered by the NVIDIA Blackwell architecture for the most demanding workloads in LLM training and generative AI inference. We will continue expanding our portfolio to further bring the advantages of the NVIDIA Blackwell architecture to mainstream LLM inference and data processing."

In collaboration with NVIDIA, ASRock Rack is developing a new product based on the NVIDIA GB200 NVL2 to enhance scale-out configurations and facilitate seamless integration with existing data centre infrastructures. "Working with ASRock Rack, we are offering enterprises a powerful AI ecosystem that seamlessly integrates hardware and software to deliver groundbreaking performance and scalability, helping push the boundaries of AI and high-performance computing," noted Kaustubh Sanghani, Vice President of GPU Product Management at NVIDIA.

Among other showcased innovations was the 4UMGX-GNR2, a dual-socket GPU server based on the NVIDIA MGX 4U reference architecture. The server supports up to eight full-height, full-length (FHFL) dual-slot GPUs, such as the NVIDIA H200 NVL Tensor Core GPU for mainstream enterprise applications. It also offers five FHHL (full-height, half-length) PCIe 5.0 x16 slots and one HHHL (half-height, half-length) PCIe 5.0 x16 slot for NVIDIA BlueField-3 DPUs and NVIDIA ConnectX-7 NICs, enabling multiple 200Gb/s or 400Gb/s network connections. It is also equipped with 16 hot-swap drive bays for E1.S (PCIe 5.0 x4) SSDs.
