The electricity consumption of artificial intelligence data centers is set to reach critical levels, threatening sustainability and operational costs.
The rapid expansion of generative artificial intelligence (GenAI), and its demand for massive processing capacity, is driving a steep increase in electricity consumption by data centers. According to a recent report from Gartner, Inc., by 2027, 40% of AI data centers could face operational restrictions due to energy shortages, jeopardizing the sector’s development.
Electricity Consumption at Record Levels
Gartner estimates that by 2027, AI-optimized data centers will require approximately 500 terawatt-hours (TWh) per year, which represents a 160% increase over current levels. This rise is primarily attributed to the implementation of large language models (LLMs) that support GenAI applications, which demand enormous data processing and storage capacities.
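As a back-of-envelope check of the figures cited above (a sketch based only on the numbers in this article, not on the Gartner report itself), a 160% increase implies that consumption in 2027 would be 2.6 times today’s level, so the implied current baseline can be derived directly:

```python
# Sanity check of the article's figures: if 500 TWh/year in 2027
# represents a 160% increase, then 2027 consumption = baseline * 2.6,
# so the implied current baseline is 500 / (1 + 1.60).
projected_twh_2027 = 500.0   # Gartner estimate cited in the text
increase_fraction = 1.60     # "160% increase over current levels"

implied_baseline_twh = projected_twh_2027 / (1 + increase_fraction)
print(round(implied_baseline_twh, 1))  # prints 192.3
```

That is, the article’s two figures are consistent with AI-optimized data centers consuming roughly 190 TWh per year today.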
According to Bob Johnson, vice president of research at Gartner, “the explosive growth of new hyperscale data centers to deploy GenAI is creating an insatiable demand for energy, surpassing the capacity of utility providers to expand quickly enough.”
Impact on Costs and Sustainability
The report also warns about the economic and environmental impact of this situation. The growing energy scarcity will inevitably drive up electricity prices, increasing the operational costs of AI models. This rise in costs will be passed on to GenAI service providers and ultimately to end users.
Moreover, the pressure to meet energy demand could jeopardize short-term sustainability goals. The need to keep operating fossil-fuel plants that were scheduled for shutdown, along with the lack of renewable alternatives reliable enough for continuous supply, will hinder the achievement of carbon neutrality targets.
David Carrero, a cloud infrastructure expert and co-founder of Stackscale (Grupo Aire), emphasizes: “The challenge lies not only in the rising cost of electricity but also in how we adapt infrastructures to ensure sustainability without compromising efficiency. Companies must invest in more efficient technologies and explore solutions such as edge computing to reduce dependence on large centralized data centers.”
Recommendations to Mitigate Risks
Gartner suggests several measures that organizations should adopt to tackle these challenges:
- Planning for High Costs: Assess the anticipated increases in electricity prices when designing new GenAI-based products and services.
- Long-Term Contracts: Negotiate energy supply agreements at reasonable prices to ensure operational stability.
- Resource Optimization: Minimize the use of computational power to reduce energy consumption and explore alternatives such as smaller language models or edge computing technologies.
- Reassessing Sustainability Goals: Adjust emissions reduction expectations considering the current limitations of renewable sources.
Future Innovations and Perspectives
While short-term solutions may rely on conventional energy sources, Gartner suggests that technological advancements, such as sodium-ion batteries and small nuclear reactors, will be key to achieving sustainability in the future.
The industry must also focus on collaboration with governments and energy providers to ensure a reliable and sustainable supply. As Carrero concludes: “The evolution of AI is intrinsically linked to the ability of data centers to adapt to an increasingly complex energy environment. It’s essential to prioritize responsible innovation so that environmental and economic impact does not hinder technological progress.” These new demands are already straining the electrical grids of many countries, a dimension that stakeholders would do well to understand.
With this perspective, industry stakeholders are called to balance innovation, sustainability, and efficiency to ensure a future where AI remains a transformative tool without compromising the planet’s resources.
Source: Gartner