Meta commits to over 6.6 GW of nuclear power for its AI data center expansion

The race for Artificial Intelligence is no longer measured only in GPUs, models, and tokens. Increasingly, it's measured in something much more down-to-earth: firm electricity, available 24/7 and at scale. In that game, Meta (the parent company of Facebook, Instagram, and WhatsApp) has just made a major move, signing agreements that collectively exceed 6.6 gigawatts (GW) of nuclear power supply for future AI-focused data centers.

To put that number in perspective: one gigawatt can power hundreds of thousands of homes, so the total is roughly equivalent to the electricity consumption of about 5 million homes. The comparison isn't perfect, since data centers draw power far more constantly and with different peak patterns than a residential neighborhood, but it helps convey the scale of the leap.
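For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch; the capacity factor and per-home consumption are assumed typical US values, not figures from the deals themselves:

```python
# Back-of-envelope: how many homes could ~6.6 GW of nuclear capacity supply?
# All figures below are assumptions (typical US values), not numbers from
# Meta's announcements.
CAPACITY_GW = 6.6            # contracted nuclear capacity across the deals
CAPACITY_FACTOR = 0.90       # nuclear plants typically run ~90% of the hours
HOME_KWH_PER_YEAR = 10_500   # rough average US household consumption

HOURS_PER_YEAR = 8_760
annual_twh = CAPACITY_GW * HOURS_PER_YEAR * CAPACITY_FACTOR / 1_000
homes_millions = annual_twh * 1e9 / HOME_KWH_PER_YEAR / 1e6

print(f"~{annual_twh:.0f} TWh/year, roughly {homes_millions:.1f} million homes")
# -> ~52 TWh/year, roughly 5.0 million homes
```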

Why is Meta pursuing nuclear (and why now)?

The core reason is simple: AI data centers need continuous, highly available energy. Training models, running large-scale inference, and supporting real-time services don't fit well with an overloaded grid, or with a generation mix that depends heavily on the weather, unless there's sufficient backup and storage.

In this context, Meta isn’t alone: the tech sector is entering a phase where energy becomes the real “bottleneck” for growth. And that’s where nuclear power — for its steady supply and low direct CO₂ emissions — re-emerges as an attractive option for critical loads.
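To make "steady supply" concrete, here is a minimal sketch comparing how much nameplate capacity different sources would need, on average, to serve a constant 1 GW load; the capacity factors are rough, commonly cited figures, not numbers from the article:

```python
# Rough illustration: nameplate capacity needed to serve a constant 1 GW load.
# Capacity factors are approximate, commonly cited averages (assumptions).
LOAD_GW = 1.0

capacity_factors = {
    "nuclear": 0.90,         # generates near-continuously
    "onshore wind": 0.35,    # depends on wind conditions
    "utility solar": 0.25,   # only generates during daylight
}

for source, cf in capacity_factors.items():
    nameplate_gw = LOAD_GW / cf  # ignores storage, curtailment, transmission
    print(f"{source:>13}: ~{nameplate_gw:.1f} GW of nameplate capacity")

# Intermittent sources also need storage or backup for the hours they don't
# generate, a cost this simple average-based ratio doesn't capture.
```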

Three agreements, three pieces of the puzzle

The interesting aspect of Meta's strategy is that it doesn't rely on a single path, but on a combination of existing reactors (more immediate capacity) and new nuclear builds (longer, less certain timelines, but room to expand).

1) Vistra: leveraging operational plants to accelerate delivery

The agreement with Vistra targets up to 2.6 GW of nuclear supply. The key is that a significant portion comes from assets that are already operational (plus upcoming capacity upgrades), which allows for more realistic timelines than building reactors from scratch. The advantage is clear: if the goal is to sustain short- and medium-term growth, existing assets carry more weight than promising future projects.

2) TerraPower: moving into “new nuclear” with Natrium

Meta has also signed an agreement with TerraPower, the nuclear company founded by Bill Gates, to develop capacity based on its Natrium technology. Here the narrative shifts: this is "next-generation" nuclear, a design that integrates energy storage for load management and greater grid flexibility.

One caveat: projects of this type typically target the 2030s. In other words, they answer a longer-term question: "How do I sustain this when my data centers are no longer just campuses, but a constellation of AI factories?"

3) Oklo: microreactors and energy campuses for AI

The third pillar is Oklo, proposing a “power campus” approach with small modular reactors. In theory, this is the kind of solution that appeals to those imagining data centers as industrial complexes that, over time, could resemble large-scale plants: on-site energy, long-term contracts, and modular scalability.

The risk, again, lies in scheduling and execution. The market has learned over the years that "nuclear projects" and "surprise-free timelines" rarely go hand in hand, even with modular designs.

Prometheus and Hyperion: Meta’s megacenter context

This energy push isn't happening in isolation. Meta has been announcing plans for large-scale AI data centers, with projects publicly known as Prometheus and Hyperion, whose electrical needs are now measured in gigawatts. The takeaway is clear: if your roadmap includes facilities aiming for 1 GW or more, it's no longer enough to "buy energy on the market"; you start to need a supply strategy of your own.

The uncomfortable part: energy as the new frontier of concentration

An uncomfortable dimension of this story is that when big tech companies start securing huge volumes of firm power via long-term deals, the debate shifts beyond technology to economics and politics.

  • What happens to the rest of the economy when increasing amounts of new capacity are negotiated around hyper-concentrated loads?
  • How does public perception of nuclear change when the focus moves from “national decarbonization” to “fueling AI data centers”?
  • What happens if the timelines for new nuclear slip, but data centers come online sooner due to business pressure?

Meta’s approach with a mix of existing and next-gen solutions highlights a broader shift: AI is no longer just competing for talent and chips; it’s also vying for energy infrastructure.


Frequently Asked Questions

Why do AI data centers need so much “firm” energy?

Because large-scale training and inference require constant, reliable compute with high availability and stability. Interruptions, network limitations, or intermittent power directly impact performance, costs, and deployment schedules.
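As a purely hypothetical illustration of that cost sensitivity (every number below is an assumption made up for the arithmetic, not reported data):

```python
# Hypothetical illustration: the cost of a one-hour power interruption to a
# large GPU fleet. Every figure is an assumption made up for the arithmetic,
# not a number from Meta or the source article.
GPUS = 100_000                # assumed training-fleet size
COST_PER_GPU_HOUR = 2.00      # assumed all-in cost per GPU-hour (USD)
OUTAGE_HOURS = 1.0
RESTART_OVERHEAD_HOURS = 0.5  # assumed checkpoint-reload and resync time

idle_hours = OUTAGE_HOURS + RESTART_OVERHEAD_HOURS
idle_cost = GPUS * COST_PER_GPU_HOUR * idle_hours
print(f"~${idle_cost:,.0f} of idle compute per outage, before schedule slip")
# -> ~$300,000 of idle compute per outage, before schedule slip
```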

What are small modular reactors, and why are they interesting for AI?

SMRs (Small Modular Reactors) promise scalable, modular construction, bringing generation closer to consumption points. On paper, they fit with the idea of “industrial campus” data centers, though their actual adoption depends on licensing, supply chains, and real-world execution.

When might the promised nuclear supply from these agreements materialize?

Existing plants could deliver sooner, but new nuclear projects tend to target the 2030s. Timelines depend on regulation, financing, construction, and operational testing.

Could this impact electricity prices or grid planning?

Indirectly, yes: large long-term contracts influence investment signals, capacity priorities, and planning. The specific impact depends on how agreements are structured and local supply, demand, and regulatory conditions.

Sources: tomshardware and finanznachrichten
