At GTC, NVIDIA announced its most advanced enterprise infrastructure to date: NVIDIA DGX SuperPOD with Blackwell Ultra GPUs, designed so companies in any sector can build AI factories for developing agentic reasoning, generative AI, and physical AI applications.
The new solution, built from NVIDIA DGX GB300 and DGX B300 systems plus high-speed networking, is designed to deliver unprecedented performance for training and for scaling complex inference workloads.
Supercomputing for Industrial-Scale AI Factories
The liquid-cooled DGX GB300 system integrates 36 NVIDIA Grace CPUs and 72 Blackwell Ultra GPUs in a rack-scale design built to deliver real-time responses from advanced reasoning models. Thanks to fifth-generation NVLink, the GPUs operate as a single large pool of shared memory, capable of handling extreme computational workloads.
Each DGX GB300 delivers up to 70 times the AI performance of systems based on the previous Hopper architecture and offers 38 TB of fast memory, making it well suited to multi-step reasoning applications and complex inference environments.
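As a rough sanity check, the 38 TB figure lines up with the per-device capacities NVIDIA has published for Blackwell Ultra and Grace. The sketch below is a back-of-the-envelope breakdown, assuming roughly 288 GB of HBM3e per Blackwell Ultra GPU and about 480 GB of LPDDR5X per Grace CPU; those per-device numbers come from NVIDIA's general specifications, not from this announcement.

```python
# Rough breakdown of the DGX GB300 rack's ~38 TB of fast memory.
# Per-device capacities are assumptions based on published Blackwell Ultra
# (HBM3e) and Grace (LPDDR5X) specs, not figures from this announcement.

GPU_COUNT = 72            # Blackwell Ultra GPUs per rack
CPU_COUNT = 36            # Grace CPUs per rack
HBM3E_PER_GPU_GB = 288    # assumed HBM3e per Blackwell Ultra GPU
LPDDR_PER_CPU_GB = 480    # assumed LPDDR5X per Grace CPU

gpu_memory_tb = GPU_COUNT * HBM3E_PER_GPU_GB / 1000   # ~20.7 TB
cpu_memory_tb = CPU_COUNT * LPDDR_PER_CPU_GB / 1000   # ~17.3 TB
total_tb = gpu_memory_tb + cpu_memory_tb

print(f"GPU HBM3e:   {gpu_memory_tb:.1f} TB")
print(f"CPU LPDDR5X: {cpu_memory_tb:.1f} TB")
print(f"Total fast memory: {total_tb:.1f} TB")        # ~38 TB
```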
The air-cooled DGX B300 system, meanwhile, brings generative and reasoning AI to conventional data centers, with up to 11 times higher inference performance and 4 times higher training performance than the Hopper generation. Each system includes 2.3 TB of HBM3e memory and accelerated networking with eight NVIDIA ConnectX-8 SuperNICs and two BlueField-3 DPUs.
NVIDIA Instant AI Factory: AI Without Waiting
For companies looking to deploy this infrastructure without a lengthy planning process, NVIDIA has launched NVIDIA Instant AI Factory, a managed service offered with Equinix that will provide preconfigured DGX SuperPODs with next-generation AI capabilities in more than 45 markets worldwide.
The service will give organizations turnkey AI factories, eliminating the months of planning and configuration typically needed to stand up supercomputing infrastructure for model training and real-time reasoning workloads.
Advanced Orchestration and Management Software
To simplify the management of these environments, NVIDIA has also introduced Mission Control, its new orchestration platform for data center operations on Blackwell-based DGX systems. DGX SuperPOD additionally supports the NVIDIA AI Enterprise platform, which includes NIM microservices, frameworks, libraries, and tools for developing and optimizing enterprise AI agents.
Among the highlighted features is integration with Llama Nemotron, the new family of open reasoning models, which will provide advanced inference capabilities for autonomous AI agents in production environments.
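For context, NIM microservices expose an OpenAI-compatible HTTP API, so an agent built on NVIDIA AI Enterprise could query a Llama Nemotron reasoning model served on the SuperPOD with a standard client. The minimal sketch below assumes a NIM endpoint already running at a hypothetical local URL and uses a placeholder model identifier; both are illustrative assumptions rather than details from the announcement.

```python
# Minimal sketch: querying a Llama Nemotron reasoning model served as a
# NIM microservice through its OpenAI-compatible API. The base_url and
# model name are illustrative assumptions, not values from the announcement.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local NIM endpoint
    api_key="not-used",                   # local NIM deployments may not require a real key
)

response = client.chat.completions.create(
    model="nvidia/llama-3.3-nemotron-super-49b-v1",  # placeholder model identifier
    messages=[
        {"role": "user", "content": "Plan the steps to reconcile two inventory databases."},
    ],
    temperature=0.6,
    max_tokens=512,
)

print(response.choices[0].message.content)
```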
Availability
The NVIDIA DGX SuperPOD with DGX GB300 and DGX B300 systems will be available through NVIDIA partners over the course of this year, while NVIDIA Instant AI Factory is expected to be available by the end of 2025.
Companies that want to learn more about this next-generation AI infrastructure can watch NVIDIA's keynote and register for GTC sessions, which run through March 21.
Source: NVIDIA