The Stargate Project – a joint venture by OpenAI, Oracle, and SoftBank – is a $500 billion investment announced by Trump in January toward ultimately building 20 such massive OpenAI data centers across the U.S., with Stargate Texas being the first. During the press conference unveiling the plan, Trump pressed Altman, who took the podium next, to speak about how Stargate will cure cancer. It's not clear what Stargate will actually accomplish as a massive data center, beyond a vague stated mission to "elevate humanity."
Stargate is the highest-profile case, but it's part of a larger phenomenon: a surge of AI and cloud-storage data centers into Texas, attracted by the only independent energy grid in the country. Texas has always taken a business-friendly approach to the grid, offering cheap energy and often loose regulations on groundwater pumping. Energy and water are what make data centers run, usually 24/7 once they're turned on. Data centers need water to cool their processing servers, a harder task in hotter states like Texas. They could use air conditioning instead, but energy is generally a more expensive commodity than water.
When operational, Stargate will use enough energy to power 750,000 homes. To sustain such a huge demand, OpenAI is building its own natural gas power plant to power Stargate. The emergence of these mega data centers that require their own power plants has become another concern for experts on water resources.
The average midsized data center uses 300,000 gallons of water a day, roughly the use of a thousand homes. Larger data centers might use 4.5 million gallons a day, depending on their type of water cooling system. Austin has 47 such data centers, while the Dallas-Fort Worth area hosts the majority in Texas at 189.
It’s been difficult for HARC and experts like Robert Mace, executive director of the Meadows Center for Water and the Environment at Texas State University, to extract transparent water usage reports from data centers. “Their use could be horrific relative to local use, or it could be extremely minimal,” Mace said.
In a white paper to be released this month, HARC estimates that data centers in Texas will consume 49 billion gallons of water in 2025. HARC also projects that by 2030, that number could rise to as much as 399 billion gallons, or 6.6% of total water use in Texas.
Most data centers use an evaporative cooling system, in which the servers' heat is absorbed by water. The heat is then removed from the water through evaporation, causing some of the water to be lost as vapor into the air. The cooler water then goes back through the machines, and this loop is regularly topped off with fresh water, because each cycle of evaporation concentrates minerals in what remains, rendering the water too salty to use after four or five passes. "Then they dump the water, and it goes down the sewer," Mace said.
This water loss is significant at a time when, even after the devastating flooding earlier this month, nearly a quarter of the state remains in drought conditions.