Offshore Engineering
Underwater Data Centers Were Never the Answer

Youp Overtoom
Marketing Director

Microsoft ends its subsea data center experiment, confirming that scalable compute needs energy alignment, not exotic locations.
What Happened
Microsoft has officially confirmed that Project Natick, its experimental underwater data center initiative, is no longer active. The project began in 2013 as an internal research concept and advanced through two phases, the second of which involved submerging 855 servers off the coast of Scotland in 2018 for over two years. The subsea capsule demonstrated impressive reliability, with a failure rate of just 0.7% compared to 5.9% in a parallel land-based experiment.
Despite these results, Noelle Walsh, head of Microsoft's Cloud Operations and Innovation division, confirmed the project would not continue into commercial deployment. Microsoft stated it would carry the learnings from Natick into other areas, including liquid immersion cooling and insights into sealed, controlled environments for server longevity. The company also open-sourced several related patents.
Structural Context
Project Natick was conceived during a period when data center cooling costs and environmental sustainability dominated strategic planning. The logic was appealing: place servers in naturally cool ocean water, reduce energy consumption for thermal management, and locate infrastructure near coastal populations. At the time, this represented a creative attempt to solve what was primarily a cooling problem.
The landscape has shifted. Today, the binding constraint on data center deployment is not cooling or proximity to users. It is access to power. As AI workloads have surged, power requirements per rack have grown dramatically, and the infrastructure sector has moved from asking "how do we cool compute?" to asking "where do we find the energy to run it at all?" Subsea capsules, however elegant from an engineering perspective, do not solve this newer and more fundamental challenge. They are difficult to service, expensive to deploy, and do not address the structural gap between energy supply and compute demand.
Meanwhile, other approaches to cooling, particularly direct liquid cooling and immersion systems, have matured rapidly and can be deployed at scale in conventional or modular facilities. The industry has absorbed the thermal lesson without needing to go underwater.
The Enki Perspective
The story of Project Natick is ultimately a story about infrastructure logic evolving faster than any single project can keep pace with. When Natick was designed, the problem was thermal. Now, the problem is electrical. That shift changes everything about where and how data centers should be built.
Project Enki's model begins where Natick's ended, not with the question of how to cool servers, but with the question of where energy is available and underutilized. Stranded and curtailed renewable energy represents a vast and growing pool of capacity that the grid cannot absorb. Rather than engineering exotic enclosures for remote deployment, the more effective path is to bring modular, energy aligned compute infrastructure directly to the point of generation. This approach solves for power access, reduces dependence on constrained transmission networks, and shortens deployment timelines from years to months.
The reliability findings from Natick are genuinely valuable. Sealed, controlled environments with minimal human intervention produced better server outcomes. Those insights translate directly into the kind of modular, autonomous deployment models that energy aligned infrastructure already favors. The lesson is not that subsea is wrong. The lesson is that the same principles work better on land, closer to generation, and at a fraction of the cost and complexity.
What This Signals
Microsoft's decision to close Project Natick reflects a broader recalibration across the industry. The era of experimental, location driven infrastructure concepts is giving way to a more pragmatic focus on energy access and deployment speed. Hyperscalers are not short on demand or capital. They are short on power, and on the grid capacity to deliver it.
This shift creates structural opportunity for infrastructure models that can operate outside the traditional grid delivery chain. Modular facilities sited at or near renewable generation sources offer a credible path to scaling compute without waiting for transmission upgrades that take a decade or more. Institutional capital increasingly recognizes this, favoring repeatable and energy sovereign deployment models over bespoke engineering experiments.
The future of AI infrastructure will not be found at the bottom of the ocean. It will be found wherever clean energy is generated and currently wasted, converted into compute capacity that the world urgently needs.
Source: Sebastian Moss, 2024, https://www.datacenterdynamics.com/en/news/microsoft-confirms-project-natick-underwater-data-center-is-no-more/
Explore compute at the source of power
Project Enki B.V. | a TJYP Venture
Chamber of commerce: 98681036