
A boom in artificial intelligence (AI) data centers is proving to be a bust for drought-prone regions.
Water-guzzling tech projects are stretching resources nationwide at a time when the Trump Administration is forging full steam ahead to build out AI infrastructure with the enthusiastic support of Big Tech. Consider: More than 160 new AI data centers have sprouted across the country over the past three years in places with high competition for scarce water resources, according to a Bloomberg News analysis of data from the nonprofit World Resources Institute and market researcher DC Byte. That’s a 70% increase from the prior three-year period. And at least 59 more data centers will be built in water-stressed regions of the United States by 2028, Bloomberg reported.
At the projected rate of domestic AI expansion, data centers could consume as much water each year as 18.5 million households, just to cool servers.
Such staggering statistics prompted former Google CEO Eric Schmidt to warn, “AI is about to blow out America’s power supply.”
Indeed, a white paper from energy resilience provider Enchanted Rock, “Speed-to-Power Bottlenecks Undermine US AI Dominance & Data Center Revenue,” says grid delays are stalling billions of dollars in AI investments, and interconnection delays for data centers can stretch beyond seven years. AI could add $4.4 trillion to U.S. GDP, the paper notes, but not without adequate power.
Data centers powering AI tools and cloud services lean heavily on a process called evaporative cooling that gobbles up millions of liters of water daily to prevent servers from overheating. A single 100-megawatt data center can consume as much as 2 million liters of water per day, roughly what 6,500 households use, according to the International Energy Agency.
Among the chief culprits are Amazon.com Inc., Alphabet Inc.’s Google and Microsoft Corp., which continue to build in hot, dry areas such as Arizona and Texas. Google, Microsoft and Facebook parent Meta Platforms Inc. used an estimated 580 billion gallons of water to provide power and cooling to data centers and AI servers in 2022 — enough water to meet the annual needs of 15 million households.
A Guardian investigation uncovered that from 2020 to 2022, the real emissions from the company-owned data centers of Google, Microsoft, Meta and Apple Inc. were more than 600% higher than officially reported.
What is more, Google-owned data centers discharge just 20% of the water they withdraw to wastewater treatment plants; the other 80% is lost to evaporation.
Stargate, the $500 billion AI project overseen by OpenAI, SoftBank and Oracle Corp., plans a campus in Abilene, Texas. Meanwhile, the Environmental Protection Agency says it is committed to ramping up AI expansion.
While all three tech giants have vowed to be “water positive” by 2030, meaning they will return more water to the environment than they consume, critics scoff at the notion, arguing that “water offsetting” fails to address local shortages. (Microsoft says it has developed a closed-loop cooling system that avoids evaporation and intends to use it in new facilities in Wisconsin and Arizona.)
Fears of climate calamity, depleted water resources, more pollution and spiking utility bills confront consumers and local lawmakers as Big Tech ambitiously pursues the expansion of AI data centers both domestically and abroad.
AI server and data center energy demand could triple over the next five years, according to a report from Berkeley Lab. By 2028, experts predict, annual energy demand could equal the electricity used by more than 28 million households and require as much as 720 billion gallons of water just to cool AI servers.
The effects are visible across the country.
Data centers in Arizona withdraw massive amounts of water in areas where farmers have fallowed fields and residents went without tap water for most of 2023, according to The Atlantic.
Google’s data centers in Oregon accounted for one-fourth of the water used in The Dalles, a dry area usually off-limits to new industrial water users. Water consumption at The Dalles facility nearly tripled between 2017 and 2022, and Google plans to open two more data centers nearby.
Meanwhile, predominantly Black neighborhoods in Randolph, Ariz., and Memphis, Tenn., face more pollution from gas-powered turbines, whose emissions are linked to asthma and lung cancer. Elon Musk’s xAI is operating at least 18 turbines without permits in Memphis.
The issue isn’t confined to the U.S.; China, India, Saudi Arabia and the United Arab Emirates are also building data centers in dry regions.
Then there is the notorious energy eater ChatGPT. A search on the AI chatbot uses nearly 10 times as much electricity as a Google search, says William Becker, a former regional director at the Department of Energy and author of several books on climate change and national disaster policies.
Last year, ChatGPT chewed up more than a million kilowatt-hours of electricity daily, as much as 180,000 U.S. households use. AI servers could catapult usage 150-fold within a decade; by 2028, they would require enough electricity to power more than 28 million households for a year.
“It has become an acute problem: The size of data centers are getting bigger, and capacity is being sucked up,” Allan Schurr, chief commercial officer at Enchanted Rock, said in an interview. “Big Tech is going wherever they can find power and space in secondary markets, but that has been discovered and committed.”
As for solutions, Schurr offers two: a commitment from data centers to consume 5% less energy during certain times of the year, such as during a polar vortex, and the use of nuclear power as “a new on-ramp” to energy alternatives.