Since the emergence of ChatGPT in November 2022, artificial intelligence (AI) has transformed entire sectors, demanding increasingly advanced technological infrastructure. With over 300 million weekly users by January 2025, the exponential growth of AI is generating profound changes in the data center industry, from network and cooling system design to energy consumption management.
To understand how these changes impact digital infrastructure and which solutions are being implemented in Europe, David Carrero, co-founder of Stackscale (Grupo Aire), a European company specializing in cloud infrastructure, bare-metal servers, and data centers, analyzes the current challenges and the importance of private solutions for optimizing performance and security in this new paradigm.
AI Imposes New Requirements on Data Centers
AI workloads require unprecedented computing power, which introduces challenges in networking, energy consumption, and cooling. Data center operators are adjusting their infrastructures to meet these new requirements.
- Low-latency networks: “AI models need ultra-fast connectivity. While traditional data centers have relied on Ethernet, more companies are adopting InfiniBand networks to reduce latency in high-performance clusters,” explains Carrero.
- High power density: “Racks running AI workloads can exceed 70-80 kW per rack, posing challenges in electrical distribution, power supply, and cooling for data centers,” he warns.
- Infrastructure designs optimized for AI: “In AI environments, rack configurations need to adapt, prioritizing fewer racks per room but with higher computing density to improve operational efficiency: where rooms are built today for 10-15 kW per rack, we will move to over 40 kW,” details the co-founder of Stackscale (a rough sketch of what that jump means in room-level power follows this list).
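To put those density figures in context, here is a minimal back-of-the-envelope sketch in Python. The rack count per room and the PUE value are illustrative assumptions, not figures from Stackscale or the interview; only the kW-per-rack densities come from the quotes above.

```python
# Minimal sketch: rough room-level power at the rack densities cited above.
# RACKS_PER_ROOM and PUE are illustrative assumptions, not Stackscale figures.

RACKS_PER_ROOM = 20   # hypothetical room size
PUE = 1.3             # assumed power usage effectiveness (IT load + cooling/overhead)

def room_load_kw(kw_per_rack: float, racks: int = RACKS_PER_ROOM, pue: float = PUE) -> float:
    """Total facility power drawn by one room at a given rack density."""
    return kw_per_rack * racks * pue

for label, density_kw in [("legacy (~10-15 kW)", 12.5),
                          ("AI today (40 kW)", 40.0),
                          ("high-density AI (80 kW)", 80.0)]:
    print(f"{label:>24}: {room_load_kw(density_kw):,.0f} kW per room")
```

Even for a modest room, moving from roughly 12 kW to 80 kW per rack multiplies the facility load several times over, which is what drives the electrical distribution and cooling redesigns Carrero describes.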
Despite these challenges, private cloud and bare-metal infrastructures offer an optimal alternative for companies seeking guaranteed performance, advanced security, and greater control over their data.
Liquid Cooling: Key for AI Data Centers
One of the greatest challenges for data centers hosting AI workloads is cooling. While air cooling remains viable, the increase in power density is driving the adoption of liquid cooling.
Carrero explains: “Liquid cooling will be essential for managing high-density racks. Many solutions are gaining traction and will be key in AI data centers.”
The Impact of AI Growth on Energy Consumption

AI has multiplied energy consumption in data centers, generating challenges in infrastructure management. To address this demand, operators are implementing strategies such as:
- Energy design optimization, increasing electrical capacity without compromising stability.
- Use of renewable energy sources, reducing the environmental impact of the growing demand for computing. This may also involve next-generation nuclear energy sources to ensure constant production.
- Redefining SLAs (Service Level Agreements), adapting them to the new cooling and power requirements.
Carrero points out that access to sustainable and efficient energy is a key factor in the evolution of data centers in Europe. “At Stackscale, our data centers in Madrid and Amsterdam use 100% renewable energy and are optimized for efficient consumption under high-performance workloads,” he emphasizes.
Private Solutions Boost Performance and Security in the Age of AI
As AI advances, companies are seeking infrastructure that offers high performance, low latency, and maximum security. While large hyperscalers have dominated the cloud infrastructure market, private and bare-metal solutions are gaining prominence due to their multiple advantages:
✅ Greater control over data: “Companies managing AI workloads need dedicated environments where data access is exclusive and not shared with third parties,” explains Carrero.
✅ Optimized performance: “Stackscale’s bare-metal solutions allow resources to be assigned without virtualization overhead, maximizing computing power for the efficient development of AI solutions,” he adds.
✅ Advanced security: “By operating in private environments, companies can ensure regulatory compliance and protection of sensitive data without relying on shared infrastructures,” he states.
✅ Predictable costs: “Compared to the volatility of costs in public cloud, private environments offer a more stable and predictable cost structure, allowing companies to scale without surprises in billing,” he points out.
Stackscale’s data centers in Madrid and Amsterdam are designed to support AI workloads with high-speed networks, 100% renewable energy, and scalable infrastructure for businesses that require total control over their data and resources.
Will There Be Exclusive AI Data Centers in the Future?
The future of data centers in the age of AI is still taking shape, but all signs point to traditional data centers and AI-specialized facilities following separate paths.
Carrero highlights that if AI demand stabilizes at density ranges of up to 100 kW per rack, current data centers can adjust. “However, if workloads continue to grow, dedicated data centers for AI will be necessary, optimized for ultra-low latency networks and advanced liquid cooling,” he states.
In the next three years, more specialized AI facilities are expected to emerge, especially if companies can effectively monetize their artificial intelligence solutions.
“The growth of Artificial Intelligence is redefining the data center sector. Companies seeking to optimize their workloads must choose infrastructures that offer security, performance, and energy efficiency. At Stackscale, we advocate for private solutions that allow our clients to maximize control and efficiency of their AI projects without relying on the public cloud,” concludes Carrero.