OpenAI’s Colossal Energy Plan: As Much Electricity as India and a Larger Carbon Footprint Than ExxonMobil

The race for generative artificial intelligence is not only fought on benchmarks and increasingly sophisticated models. It is also rapidly shifting into the most physical realm possible: gigawatts of electrical capacity, millions of chips, and dozens of new semiconductor factories.

According to an investigative report published in Truthdig, OpenAI has a goal as ambitious as it is concerning: to reach up to 250 gigawatts (GW) of installed computing capacity by 2033. The figure appears in a purported internal memo dated September 2025, signed by Sam Altman himself, the company’s CEO.

To grasp the scale: a continuous draw of 250 GW would roughly match all the electricity consumed today by India, a country of 1.5 billion people. And, translated into emissions, the associated deployment could amount to double the annual carbon footprint of ExxonMobil, regarded as the world’s largest private emitter of CO₂.


250 GW of AI: a country-sized infrastructure

The capacity of a large data center is measured in megawatts (MW) or, in extreme projects, in gigawatts. For years, a 1 GW campus already sounded enormous. The plan attributed to OpenAI goes much further: 250 gigawatts deployed in less than a decade.

Behind that figure are several layers:

  • Electricity: 250 GW operating continuously demand an energy load similar to entire economies. It’s not just a peak, but a nearly constant load, 24/7, 365 days a year.
  • Emissions: in a world where a large part of electricity generation still relies on fossil fuels, such increased demand would entail hundreds of millions of additional tons of CO₂ unless paired with a massive rollout of renewables and storage infrastructure.
  • Infrastructure: the current grid — substations, high-voltage lines, cooling systems, water supply — is not designed to absorb several “digital countries” like this overnight.

The memo from Altman, cited in the report, presents this goal as a “bold” long-term target. But the scale suggests something beyond technological ambition: it’s a redesign of the global energy infrastructure to serve AI.


60 million GPUs and 10 “Fab 25” factories to feed the monster

To sustain those 250 GW, the report’s author makes another calculation: it would require, for OpenAI alone, at least 60 million Nvidia GB300 GPUs running in parallel in large server farms.

If we assume a lifespan of about two years — not so much because the hardware physically “dies,” but because it loses economic value compared to newer generations — the figure implies manufacturing and deploying around 30 million GPUs every year just to keep the fleet updated.

This demand would have an immediate impact on the supply chain:

  • A single “mega-fab,” like TSMC’s Fab 25 in Taichung, Taiwan, is expected to produce around 3 million GPUs annually at full capacity.
  • Meeting OpenAI’s plan alone would require at least 10 such fabs dedicated entirely to AI chips, alongside memory factories and packaging plants.
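The fleet and fab numbers above fit together as simple arithmetic. The sketch below reproduces the report's calculation from the figures quoted in the article; the per-GPU power budget is derived from those figures, not stated in the source.

```python
# Back-of-envelope check of the report's GPU-fleet arithmetic.
# All inputs are figures quoted in the article.

TARGET_CAPACITY_GW = 250          # OpenAI's reported 2033 target
FLEET_SIZE = 60_000_000           # GB300-class GPUs, per the report
LIFESPAN_YEARS = 2                # economic (not physical) lifespan
FAB_OUTPUT_PER_YEAR = 3_000_000   # GPUs/year from one Fab 25-scale plant

# Implied all-in power budget per GPU (chip plus cooling, networking, etc.),
# a derived figure, not one stated in the report:
watts_per_gpu = TARGET_CAPACITY_GW * 1e9 / FLEET_SIZE
print(f"~{watts_per_gpu / 1000:.1f} kW per GPU, all-in")  # ~4.2 kW

# Replacement rate needed just to keep the fleet current:
gpus_per_year = FLEET_SIZE // LIFESPAN_YEARS
print(f"{gpus_per_year:,} GPUs per year")                 # 30,000,000

# Number of mega-fabs that replacement rate would consume:
fabs_needed = gpus_per_year / FAB_OUTPUT_PER_YEAR
print(f"{fabs_needed:.0f} Fab 25-scale plants")           # 10
```

The two-year lifespan is the load-bearing assumption here: stretching it to three years would cut the required fab count by a third.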

And OpenAI isn’t alone: other major tech companies — at least five more, according to the report — also possess capital and plans to build multi-gigawatt data centers for their own AI models. The pressure on semiconductor manufacturing and resource consumption extends well beyond a single company.


Taichung and the Fab 25 case: 7% of a city’s water for AI chips

To illustrate the impact of this chain, the report focuses on Fab 25, TSMC’s most advanced plant, whose construction began in 2025 on the outskirts of Taichung:

  • It will consume around 100,000 tons of water daily, approximately 7% of the municipal consumption of a city of 2.8 million people.
  • In a country like Taiwan, which faces recurring droughts and changing rainfall patterns linked to global warming, this extra demand creates direct conflicts with agriculture.

During the last major droughts (2021 and 2023), authorities mandated reductions of 10-15% in water use for semiconductor factories. Still, thousands of rice farmers in the south went without irrigation for multiple seasons, effectively displaced by the chip industry.
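The water figures quoted for Fab 25 can be cross-checked the same way. This sketch derives the citywide and per-capita consumption implied by the report's "100,000 tons per day, roughly 7%" claim; note that municipal consumption includes commercial and industrial use, so the per-capita number is higher than household use alone.

```python
# Sanity check of the Fab 25 water figures quoted in the article.
# 1 metric ton of water = 1 cubic meter = 1,000 liters.

FAB_WATER_TONS_PER_DAY = 100_000   # from the report
SHARE_OF_MUNICIPAL = 0.07          # "approximately 7%"
POPULATION = 2_800_000             # Taichung, per the article

# Implied total municipal consumption:
municipal_tons_per_day = FAB_WATER_TONS_PER_DAY / SHARE_OF_MUNICIPAL
print(f"~{municipal_tons_per_day:,.0f} tons/day citywide")  # ~1.43 million

# Implied per-capita figure (all municipal use, not households only):
liters_per_person_per_day = municipal_tons_per_day * 1000 / POPULATION
print(f"~{liters_per_person_per_day:.0f} L per person per day")  # ~510
```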

The water issue is compounded by energy needs:

  • Fab 25 will require around 1 gigawatt of continuous power, equivalent to the annual electricity consumption of some 750,000 urban households.
  • Most of Taiwan’s electricity still comes from coal and gas plants, with a high carbon footprint.
  • Many industrial gases used in chip manufacturing, such as sulfur hexafluoride (SF₆), have global warming potentials thousands of times higher than that of CO₂.
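Converting the 1 GW figure into annual energy makes the household comparison concrete. This sketch assumes continuous operation (8,760 hours per year); the per-household figure is implied by the report's comparison, not stated in it.

```python
# Converting Fab 25's 1 GW continuous draw into annual energy.

FAB_POWER_GW = 1.0
HOURS_PER_YEAR = 24 * 365   # 8,760 h, assuming continuous operation
HOUSEHOLDS = 750_000        # comparison used in the report

# Annual energy: 1 GW running all year.
annual_twh = FAB_POWER_GW * HOURS_PER_YEAR / 1000   # GWh -> TWh
print(f"{annual_twh:.2f} TWh per year")             # 8.76

# Implied annual consumption per household:
kwh_per_household = FAB_POWER_GW * 1e6 * HOURS_PER_YEAR / HOUSEHOLDS
print(f"~{kwh_per_household:,.0f} kWh per household per year")  # ~11,680
```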

Fab 25 is just one piece of an expanding puzzle: each new chip generation demands more energy and water than the previous one, due to increased process complexity and tighter tolerances.


South Korea: Samsung’s mega-cluster and factories consuming half of Seoul’s water

The shift toward AI is also reshaping South Korea’s industrial landscape. Samsung, aiming to bridge the gap with TSMC, plans a mega-chip cluster in Yongin, south of Seoul, which the report describes as even larger than the Taiwanese project.

Local estimates suggest that:

  • That complex alone could consume more than half of Seoul’s current water usage.
  • Its electricity demand would be around 16 gigawatts, about one-sixth of the country’s total consumption.

The impact extends beyond water and power. South Korea’s semiconductor industry has a history of occupational illnesses and exposure to carcinogens, according to workers’ groups. The report mentions documented cases of leukemia, brain tumors, and other cancers among plant employees, as well as conflicts over transparency regarding risks.


U.S. and the “resurgence” of toxic waste

With the CHIPS Act, the U.S. is promoting the construction of over twenty new semiconductor factories to reduce dependence on Asia. Some notable examples include:

  • TSMC’s Fab 21 in Phoenix, Arizona.
  • An Amkor chip packaging plant in Peoria, also in Arizona.
  • A planned SK Hynix memory factory in West Lafayette, Indiana, key for high-bandwidth memory used in AI GPUs.

These developments raise similar concerns:

  • High water consumption in already drought-stricken regions.
  • Extensive use of PFAS (“forever chemicals”) that persist and accumulate environmentally.
  • Increased heavy truck traffic transporting toxic substances and waste through residential areas.
  • Labor conditions that, according to labor unions, fall short of the “clean, well-paid jobs” image often portrayed by the sector.

Beneath these plants lies a story familiar from Silicon Valley: a legacy of contaminated sites designated for Superfund cleanup due to early chip-manufacturing waste, with the current expansion risking a repeat of that pattern elsewhere in the country.


More mining, more waste, and local conflicts

The report also highlights that the problem stretches beyond each industrial site. AI and next-generation chips increasingly demand:

  • Critical minerals like copper, nickel, and rare earth elements.
  • New mines in remote or fragile ecosystems, often on indigenous lands.
  • Logistics for mining, processing, chip manufacturing, and electronic waste management.

The International Energy Agency forecasts that global demand for critical minerals could quadruple by 2040, driven by AI, digitalization, and renewable technologies. Many extractions will occur in areas that today serve as carbon sinks or biodiversity refuges.

Even companies that emphasize sustainability in their reporting often provide only fragmented data on water, emissions, discharges, PFAS, and labor conditions, making it hard to assess the full impact, according to organizations consulted in the report.


AI, energy, and the planet: the uncomfortable question

Since 2020, many tech firms have touted “net-zero” and sustainability goals. But the rapid shift toward generative AI has put those plans under intense pressure: the same leaders who pledged to cut footprints are now competing to build the largest data centers.

The supposed goal of OpenAI reaching 250 GW by 2033 encapsulates this tension:

  • Achieving it would require reconfiguring large parts of the worldwide electrical infrastructure and building dozens of new mega-factories, with substantial water, energy, and chemical use.
  • Even with efficiency and recycling efforts, like those implemented by TSMC and Samsung, rapid expansion only slows the growth of impacts; it does not stop them.
  • And OpenAI is not alone: other major AI players are pursuing similar massive computing goals.

While Silicon Valley CEOs calculate how many gigawatts and GPUs they need to “power” the next generation of models, the report’s deeper question remains: how much additional AI can the planet truly sustain?

The answer today isn’t found in any internal memo. It likely depends not only on technological capabilities but also on societal, regulatory, and community willingness to accept impacts on water, energy, health, and land.

References: Truthdig and Tom’s Hardware
