The new space race is taking place in data centers. The expansion of AI demands facilities with higher density, energy efficiency, and ultra-fast connectivity.
The unstoppable rise of artificial intelligence is redefining the architecture of data centers. What seemed like a pipe dream just a few years ago—high-density infrastructures to support colossal computational loads—has become an urgent necessity. With increasingly large and demanding language models, the tech industry has fully embraced building data centers specifically designed for AI.
The New Golden Age of Computing
Ever since the 2020 study “Scaling Laws for Neural Language Models” showed that models gain capability predictably as their size, training data, and compute grow, the trend has been clear: build gigantic data centers filled with GPUs. Pre-training alone, however, is no longer enough. New post-training techniques, such as reinforcement learning with automated feedback (popularized by DeepSeek) and inference-time compute, are driving a new generation of models with adaptive reasoning.
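As a quantitative aside, the scaling laws in that study take a simple power-law form. The sketch below uses the paper’s notation; the remark about exponent sizes reflects its approximate fits and is not a figure from this article.

```latex
% Approximate power-law form of pre-training loss from Kaplan et al. (2020):
% N = model parameters, D = training tokens, C = compute; N_c, D_c, C_c are fitted constants.
L(N) \approx \left(\frac{N_c}{N}\right)^{\alpha_N}, \qquad
L(D) \approx \left(\frac{D_c}{D}\right)^{\alpha_D}, \qquad
L(C) \approx \left(\frac{C_c}{C}\right)^{\alpha_C}
% The fitted exponents are small (on the order of 0.05 to 0.1), so each further drop in loss
% requires multiplying parameters, data, and compute, which is why GPU data centers keep growing.
```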
All these techniques share a common denominator: an ever-growing demand for computational resources and, therefore, for data centers capable of housing that compute.
From Empty Racks to Super Density
Traditionally, AI workloads have been squeezed into conventional data centers, but that approach is beginning to fall short. According to the Uptime Institute, the global average density is under 6 kW per rack, while a rack of servers built around GPUs such as the NVIDIA H200 demands more than 40 kW. The result: half-empty racks, wasted space, and an inefficient system.
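A back-of-the-envelope calculation shows why. The sketch below assumes roughly 10 kW of draw for one 8-GPU H200-class server, a figure consistent with vendor specifications but not taken from this article:

```python
# Illustrative rack-fit arithmetic; the ~10 kW per 8-GPU server is an assumption,
# not a figure from the article.
SERVER_KW = 10.0  # assumed draw of one 8-GPU H200-class server under load


def servers_per_rack(rack_budget_kw: float, server_kw: float = SERVER_KW) -> int:
    """Number of GPU servers a rack can power without exceeding its budget."""
    return int(rack_budget_kw // server_kw)


for budget_kw in (6, 40, 120):  # legacy average, AI-ready rack, liquid-cooled rack
    n = servers_per_rack(budget_kw)
    print(f"{budget_kw:>4} kW rack -> {n} server(s), {n * SERVER_KW:.0f} kW actually used")
```

At the 6 kW global average, a rack cannot power even one such server, which is exactly why GPU gear ends up spread across half-empty racks in conventional facilities.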
Today, densities in AI data centers are already measured in tens or even hundreds of kW per rack. This leap brings a critical need to cut latency between GPU nodes and maximize local bandwidth, which requires extreme physical proximity between servers and purpose-built high-speed network fabrics.
A paradigmatic example is xAI's Colossus cluster, the world's largest AI supercomputer, with 100,000 GPUs spread across four data halls. Built in just 122 days, it uses direct-to-chip liquid cooling and active rear-door heat exchangers, doing away with traditional thermal containment systems.
Extreme Cooling and Energy Storage
Colossus, showcased by Supermicro with Elon Musk's approval, also demonstrates how systems such as Tesla Megapacks are being used to buffer millisecond-scale spikes in power draw during intensive training runs, greatly improving the reliability of the power supply.
The architecture combines coolant distribution units (CDUs) with redundant pumps, custom liquid-cooling blocks, and up to nine network ports per server for connectivity between the GPU and non-GPU clusters.
A Global Race for AI Supremacy
The United States leads the construction of AI data centers, with projects such as Stargate, the venture backed by OpenAI, SoftBank, and Oracle, which anticipates investment of up to $500 billion. Amazon plans to allocate $100 billion to technology infrastructure, mainly for AI, while Google will invest $75 billion and Microsoft $80 billion in this fiscal year alone. Meta is not far behind, with up to $65 billion devoted largely to training its Llama models.
And this race has only just begun.
The Biggest Challenge: Energy
Setting aside budgets and component supply, what stands in the way of building these data centers? The answer is clear: energy. Training giant models requires ever more electricity, pushing companies to sign long-term agreements with energy providers, from nuclear plants to renewable sources. Energy sustainability is, without doubt, one of the great challenges still to be solved.
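To put the problem in numbers, here is a rough estimate for a cluster on the scale of Colossus. Every figure (per-GPU power, host overhead, PUE) is an assumption chosen for illustration, not data reported in the article:

```python
# Rough, illustrative estimate of total facility power for a 100,000-GPU cluster.
# All constants below are assumptions for illustration, not figures from the article.
GPUS = 100_000        # GPU count cited for the Colossus cluster
GPU_KW = 0.7          # assumed ~700 W per accelerator under load
HOST_OVERHEAD = 1.5   # assumed multiplier for CPUs, memory, NICs, fans
PUE = 1.3             # assumed power usage effectiveness (cooling and facility overhead)

it_load_mw = GPUS * GPU_KW * HOST_OVERHEAD / 1000  # IT load in megawatts
facility_mw = it_load_mw * PUE                     # total draw including cooling

print(f"IT load:       ~{it_load_mw:.0f} MW")
print(f"Facility draw: ~{facility_mw:.0f} MW")
```

Even with these conservative assumptions, the result lands well above 100 MW of continuous draw, the output of a mid-sized power plant, which helps explain the long-term deals with nuclear and renewable generators.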
Conclusion: The data center of the future for AI will not only depend on the latest generation of GPUs or ultra-fast networks. It will be an infrastructure where every watt counts, where thermal design, energy efficiency, and scalability are as important as computational power. Success in the race for supremacy in artificial intelligence will depend as much on the ability to innovate as on the ability to build sustainably.
Source: w.media