Giga Computing, a subsidiary of GIGABYTE focused on high-performance enterprise solutions, has announced the launch of four new servers optimized for AI workloads, all based on the NVIDIA HGX B200 platform. These models combine advanced architecture with liquid cooling and air cooling options, strengthening Giga Computing’s position in the race for energy efficiency in data centers.
The new generation of servers incorporates the NVIDIA HGX B200 platform, built on Blackwell GPUs, which NVIDIA says delivers real-time inference up to 15 times faster than the previous Hopper generation on trillion-parameter models. This platform represents a significant step forward for infrastructure that needs to scale generative AI solutions and large language models (LLMs) without compromising thermal or electrical performance.
Giga Computing introduced two 4U models featuring direct liquid cooling (DLC), aimed at maximizing AI compute density:
– G4L3-SD1-LAX5: compatible with Intel® Xeon® processors
– G4L3-ZD1-LAX5: compatible with AMD EPYC™ processors
These servers feature separate chambers for CPU and GPU, designed to optimize airflow paths and improve thermal efficiency. They are well suited to environments running intensive large-scale AI training, where they can reduce electricity consumption and cooling costs.
For customers seeking air-cooled options with broad compatibility across x86 platforms, Giga Computing launched:
– G894-AD1-AAX5: supports Intel® Xeon® 6900 Series, including Xeon® 6962P
– G894-SD1-AAX5: supports Intel® Xeon® 6700 and 6500 Series, including Xeon® 6776P
Both models are built around the HGX B200 platform and feature an architecture that simplifies integration into existing data centers.
All of these servers integrate into GIGAPOD, Giga Computing’s rack-scale supercomputing solution designed specifically for training generative AI models with billions of parameters. The solution combines high-performance GPUs, high-speed networking, and a scalable architecture, offering:
– Higher performance per watt
– Optimized thermal design
– Advanced cooling options
The addition of the new HGX B200 servers enables GIGAPOD to adapt to the changing demands of AI workloads while maintaining sustainability and scalability.
This announcement underscores NVIDIA’s strategic role as a driving force in global AI infrastructure, especially as giants like Microsoft, Meta, and Amazon accelerate investments in foundational models. It also highlights GIGABYTE’s commitment to advanced cooling technologies as a competitive advantage in energy sustainability.
For investors, this launch signals growth in a highly specialized AI server market where thermal efficiency and modularity are decisive factors. Meanwhile, integrated solutions like GIGAPOD are emerging as solid options for data center operators seeking scalability without sacrificing efficiency.