The AI gold rush is pushing data centers to their limits. Power constraints have long been a challenge, but the next major bottleneck is water – or rather, the supply of it. As AI workloads skyrocket, traditional air-based cooling systems struggle to keep up. Liquid cooling – particularly seawater-based solutions – is emerging as a more sustainable alternative. But can it scale with AI workloads?
AI models are becoming increasingly complex. Training a model like GPT-4 requires thousands of GPUs running for weeks, consuming vast amounts of energy and generating significant heat. This has led to a sharp increase in data center power consumption. According to the International Energy Agency, global data center energy demand is expected to reach 1,000 TWh annually by 2026, up from 460 TWh in 2022. To put that number into perspective, the UK consumed 266 TWh in 2023. Data centers in prime real estate areas are putting considerable strain on local power grids. For facilities near the coast, however, water can play a critical role in efficient cooling.
The data center drought
Traditional cooling methods, such as air-cooled systems and evaporative cooling, are heavily reliant on water. A typical hyperscale data center can consume millions of liters of water daily. This is unsustainable in regions where groundwater supplies are already stressed, and concerns about tech companies drawing significant amounts of water from local supplies are prompting increasing regulatory scrutiny.
Liquid cooling, particularly direct-to-chip and immersion cooling, is gaining traction as a more efficient alternative, with the market expected to reach $48.42 billion by 2034. However, these solutions still rely on vast amounts of freshwater, leading operators to explore seawater-based cooling as a long-term strategy.
Seawater cooling isn’t new, but its application in hyperscale data centers is relatively recent. The principle is straightforward: instead of drawing on freshwater, facilities near coastlines can use seawater as a heat sink. This method significantly reduces reliance on limited freshwater resources while maintaining efficient cooling.
A 1.2 GW AI data center campus could employ existing infrastructure from a decommissioned coal plant to implement a closed-loop seawater cooling system: seawater is drawn through pipes, circulated through heat exchangers, and returned to the ocean under strict environmental controls.
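To put rough numbers on the volumes such a system would move, the sketch below works through the basic heat-balance arithmetic (Q = ṁ · c_p · ΔT) behind a closed-loop seawater heat exchanger. The 1.2 GW heat load, the 5 K allowed temperature rise, and the helper function are illustrative assumptions, not figures or designs taken from the article.

```python
# Rough estimate of the seawater flow needed to absorb a data center's heat load.
# Uses Q = m_dot * c_p * delta_T with illustrative (not article-sourced) values.

SEAWATER_SPECIFIC_HEAT_J_PER_KG_K = 3993.0   # approximate c_p of seawater
SEAWATER_DENSITY_KG_PER_M3 = 1025.0          # approximate density of seawater


def required_flow_m3_per_s(heat_load_w: float, delta_t_k: float) -> float:
    """Return the seawater volumetric flow (m^3/s) needed to remove
    heat_load_w watts while warming the water by delta_t_k kelvin."""
    mass_flow_kg_per_s = heat_load_w / (SEAWATER_SPECIFIC_HEAT_J_PER_KG_K * delta_t_k)
    return mass_flow_kg_per_s / SEAWATER_DENSITY_KG_PER_M3


if __name__ == "__main__":
    # Hypothetical example: 1.2 GW of heat rejected with a 5 K temperature rise,
    # the kind of limit a discharge permit might impose on returned water.
    flow = required_flow_m3_per_s(heat_load_w=1.2e9, delta_t_k=5.0)
    print(f"Required seawater flow: {flow:,.0f} m^3/s "
          f"({flow * 1000:,.0f} liters per second)")
```

Under those assumed numbers the loop would need to move roughly 60 cubic meters of seawater per second, which is why coastal intake infrastructure, such as that inherited from a coal plant, matters so much.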
Sink or swim
Seawater cooling significantly reduces reliance on groundwater and municipal supplies, conserving vital freshwater resources. Coastal locations offer effectively unlimited cooling capacity, making the approach highly scalable for tech companies. However, these same locations can present logistical challenges around connectivity, power availability, and security.
Liquid cooling is inherently more efficient than air cooling, helping to reduce overall power consumption and improve Power Usage Effectiveness (PUE) scores. Done properly, seawater cooling can also meet environmental requirements, with treated water returned to the sea without harming marine ecosystems. However, regulatory hurdles remain a significant barrier: governments enforce stringent rules on the temperature and chemical composition of water returned to the ocean, requiring continuous monitoring and compliance from colocation providers.
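For readers unfamiliar with the PUE metric mentioned above, it is simply total facility energy divided by IT equipment energy, with values closer to 1.0 indicating less overhead. The short sketch below illustrates the calculation with hypothetical numbers for an air-cooled versus a liquid-cooled facility; the figures are invented for illustration only.

```python
# PUE (Power Usage Effectiveness) = total facility energy / IT equipment energy.
# A value closer to 1.0 means less overhead (cooling, power distribution, etc.).
# All figures below are hypothetical and for illustration only.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the PUE for the given annual energy figures."""
    return total_facility_kwh / it_equipment_kwh


if __name__ == "__main__":
    # Hypothetical comparison: two facilities serving the same IT load,
    # one with heavier air-cooling overhead, one with lighter liquid-cooling overhead.
    it_load_kwh = 100_000_000
    air_cooled_total_kwh = 160_000_000
    liquid_cooled_total_kwh = 115_000_000

    print(f"Air-cooled PUE:    {pue(air_cooled_total_kwh, it_load_kwh):.2f}")   # ~1.60
    print(f"Liquid-cooled PUE: {pue(liquid_cooled_total_kwh, it_load_kwh):.2f}") # ~1.15
```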
Despite its potential, retrofitting existing data centers for seawater cooling is complex and costly, making new-build campuses the most practical candidates for this technology. While the advantages of this approach are clear, addressing these infrastructure and compliance challenges will be critical to its widespread adoption in the run-up to 2034.
Don’t break any dams
We are entering the “Intelligent Age,” with AI-driven transformation sweeping across industries, and demand for AI shows no sign of slowing. As more organizations deploy AI-driven workloads, data center operators must balance performance with sustainability. The push for renewable energy sources, coupled with advances in cooling technology, will define the next decade of data center infrastructure innovation.
Seawater cooling is not a white whale – it is an attainable and crucial step in reducing the environmental footprint of AI data centers. As more operators experiment with large-scale liquid cooling solutions, the industry must work collaboratively to refine best practices and ensure these systems are both sustainable and commercially viable.
For businesses relying on AI-driven insights, the future of data centers matters. Whether through energy efficiency, innovative cooling, or better resource management, sustainable AI infrastructure will shape the digital economy – and the planet – for years to come. The question is how quickly the industry can scale these solutions to meet the demands of an AI-powered world.