The race to power Artificial Intelligence is pushing digital infrastructure into increasingly unconventional territory. After years of discussions about data centers near dams, solar parks, nuclear plants, or even space stations, an Oregon startup is proposing an alternative: moving part of the computing to the sea and harnessing wave energy to perform inference tasks.
Panthalassa has closed a $140 million Series B round led by Peter Thiel, co-founder of PayPal and Palantir, to advance the manufacturing and deployment of its autonomous ocean computing nodes. The company plans to complete a pilot plant near Portland and deploy its Ocean-3 series in the North Pacific in 2026, with a view toward commercial deployment in 2027.
AI computing without bringing energy ashore
Panthalassa’s idea breaks away from the traditional data center model. Instead of building on land, connecting to the electrical grid, and solving cooling challenges afterward, the company envisions floating nodes capable of generating electricity from wave motion and using it directly onboard to power AI chips.
The electricity would not be sent to the grid. It would be used within the node itself. The results of inference tasks, which are far smaller than the computation needed to produce them, could then be transmitted ashore via low Earth orbit satellites. According to the company, these systems are designed to operate in remote ocean zones where wave energy is denser and more consistent.
This approach aligns with a growing push. AI data centers require vast amounts of energy, cooling capacity, industrial land, permits, grid connections, and local acceptance. In several regions, building new facilities already faces opposition due to their impact on the electrical grid, water use, landscapes, or nearby communities.

Panthalassa aims to sidestep some of these tensions by moving the problem offshore. Its floating nodes would be built from steel at coastal facilities and operate autonomously in the ocean. The company states that its Ocean-1, Ocean-2, and Wavehopper prototypes, tested in 2021 and 2024, validated technologies for energy generation, propulsion, autonomy, and oceanic computing.
Why inference fits better than training
The proposal doesn’t seem intended to replace the massive data centers that train foundation models. Training requires huge clusters, ultra-low latency networks, synchronization across thousands of accelerators, high-bandwidth memory, and tightly coordinated operations. Bringing that complexity to the sea would be much more challenging.
Inference, on the other hand, may be better suited. Once a model is trained, many tasks involve running queries, generating responses, classifying data, processing documents, or producing tokens. These loads can be distributed more easily if designed correctly. In this scenario, an ocean node wouldn’t need to participate in a giant training cluster but would process requests and return results.
Panthalassa’s own announcement refers to “inference computing at sea.” The distinction is important. Moving a complete data center offshore is different from creating a distributed fleet of nodes for specific AI tasks. If the energy cost is low and cooling is managed by the marine environment, the economic case can become attractive for certain workloads.
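That routing logic can be sketched in a few lines. The fleet names, latencies, and per-token costs below are purely illustrative assumptions (they are not Panthalassa figures): the point is that a scheduler can send latency-tolerant jobs to the cheapest node that still meets each request's deadline.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    latency_ms: float        # expected round-trip latency to the client
    cost_per_1k_tokens: float

# Hypothetical fleet: all figures are illustrative, not from Panthalassa.
FLEET = [
    Node("onshore-dc", latency_ms=20, cost_per_1k_tokens=0.40),
    Node("ocean-node-7", latency_ms=80, cost_per_1k_tokens=0.10),
]

def route(latency_budget_ms: float) -> Node:
    """Pick the cheapest node that satisfies the request's latency budget."""
    eligible = [n for n in FLEET if n.latency_ms <= latency_budget_ms]
    return min(eligible, key=lambda n: n.cost_per_1k_tokens)

# An interactive chat turn (tight budget) stays onshore;
# batch document processing (loose budget) goes to the cheaper ocean node.
interactive = route(30)    # -> onshore-dc
batch = route(500)         # -> ocean-node-7
```

Under these toy numbers, only workloads that tolerate the extra satellite round trip ever leave shore, which matches the "specific AI tasks" framing in the announcement.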
Cooling is one of the core arguments. AI chips generate significant heat, and dissipating it ashore requires complex cooling systems, sometimes involving additional water or energy. At sea, the ocean’s thermal mass offers a natural cooling source. Panthalassa claims this can help address one of the biggest engineering challenges of terrestrial data centers and extend chip lifespans.
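The scale of the cooling advantage can be illustrated with a simple energy balance. The 100 kW rack and 5 K temperature rise below are assumed example values, and the calculation ignores real heat-exchanger design margins; it only shows how modest a seawater flow is needed to carry away the heat.

```python
def seawater_flow_kg_s(heat_kw: float, delta_t_k: float,
                       cp_kj_kg_k: float = 3.99) -> float:
    """Seawater mass flow needed to absorb `heat_kw` with a `delta_t_k` rise.

    Energy balance: Q = m_dot * cp * dT, with cp of seawater
    roughly 3.99 kJ/(kg*K). Illustrative only.
    """
    return heat_kw / (cp_kj_kg_k * delta_t_k)

# A hypothetical 100 kW compute module, allowing a 5 K water temperature rise:
flow = seawater_flow_kg_s(100, 5)   # about 5 kg/s, i.e. ~5 liters per second
```

A flow of a few liters per second of cold seawater per 100 kW module is trivial next to what wave motion can pump, which is the core of the "ocean as heat sink" argument.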
An ambitious idea with many challenges ahead
The appeal of this concept is clear, but so are the challenges. The ocean is a harsh environment for electronics, mechanics, and maintenance. Corrosion from salt, storms, biofouling, extreme waves, structural fatigue, physical security, and equipment repairs far from shore can significantly complicate operation.
There are also regulatory issues. Operating autonomous infrastructure in open waters requires permits, maritime coordination, insurance, environmental regulations, and safety guarantees. As the fleet of nodes grows, questions about monitoring, incident response, and impacts on navigation, marine life, and fishing activities will need to be addressed.
Satellite connectivity is another critical piece. In inference tasks, sending results back can be feasible, but reliance on satellite links introduces latency, costs, availability issues, and potential capacity limits. For some AI uses, this may not be problematic. For others—especially those requiring ultra-fast responses or continuous transfer of large data volumes—it could be a barrier.
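The physical floor on that latency is easy to estimate. Assuming a Starlink-like constellation altitude of roughly 550 km (an assumption, since the article does not name a provider), the light-speed propagation delay for a node-to-satellite-to-gateway path is only a few milliseconds; the tens of milliseconds users actually see come from slant paths, switching, and queuing on top of this floor.

```python
C_KM_PER_MS = 299_792.458 / 1000   # speed of light ≈ 299.8 km per millisecond

def propagation_floor_ms(altitude_km: float, hops: int) -> float:
    """Best-case one-way propagation delay through a LEO link.

    Models each hop as a straight vertical path of `altitude_km`;
    real slant paths are longer, and processing adds further delay.
    """
    return hops * altitude_km / C_KM_PER_MS

# Ocean node -> satellite -> coastal gateway at an assumed 550 km altitude:
one_way = propagation_floor_ms(550, hops=2)   # ≈ 3.7 ms physical floor
```

Even tripling that floor for slant geometry and inter-satellite routing leaves LEO links comfortable for batch inference, while remaining marginal for applications that need sub-10 ms responses.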
Manufacturing is another consideration. Panthalassa plans to complete a pilot plant near Portland to produce these systems at scale. Moving from prototypes to repeatable manufacturing, industrial maintenance, and commercial deployment is a difficult transition. The fact that the round included investors like Peter Thiel, John Doerr, TIME Ventures, SciFi Ventures, Fortescue Ventures, and Supermicro lends financial credibility, but does not eliminate technological risk.
The ocean as a new frontier of computing
This investment also carries a broader interpretation. AI infrastructure is running out of easy solutions. The demand for compute outpaces electrical capacity in many locations. New data centers compete for energy, transformers, land, permits, chips, memory, cooling, and connectivity. That’s why proposals that seemed marginal a few years ago are gaining attention now.
Google has researched long-term space-based data centers, Elon Musk has mentioned off-Earth options, and various companies explore submarine cooling, modular nuclear energy, or deployments near renewable sources. What sets Panthalassa apart is that it doesn’t try to bring energy from the sea to land but instead takes the compute to where the energy is.
Peter Thiel summarized this concept with a striking phrase: “extraterrestrial” solutions for computing are no longer science fiction, and Panthalassa has opened “the ocean frontier.” While the tone of the project is ambitious, the reality will be less epic and more industrial: costs per token, maintenance, availability, energy efficiency, and production capacity.
If successful, Panthalassa could pioneer a new category of distributed AI inference infrastructure. It wouldn’t replace large data campuses but could ease some of the pressure on power grids and local communities—especially for workloads that do not need to be within urban areas or connected to sensitive internal systems.
The real question isn’t whether the ocean has enough energy. It’s whether that energy can be reliably, maintainably, and cost-effectively converted into compute. That’s where the future of Panthalassa will be decided. Its $140 million funding round provides the resources to try. The North Pacific will now be its testing ground.
Frequently Asked Questions
What does Panthalassa want to do?
Panthalassa is developing autonomous floating nodes that generate electricity from waves and use it to perform AI inference tasks at sea.
How much funding has the company raised?
The startup has closed a $140 million Series B round led by Peter Thiel, with participation from other tech, industrial, and climate investors.
When will the first nodes be deployed?
They plan to deploy their Ocean-3 pilot series in the North Pacific in 2026, with commercial deployments starting around 2027.
Will this replace traditional data centers?
Not in the short term. The approach seems more suited for distributed inference loads rather than large-scale model training. It could complement terrestrial infrastructure but is unlikely to fully replace it.