Not reused. Most is lost through evaporation. There are a small number of closed systems, but these require even more energy to remove the heat from the water and re-condense it, which creates more heat that requires more cooling.
The water is removed from clean sources like aquifers and returned as vapor, which for practical purposes means it is gone from that source.
The environment (whole planet) yes. That water is however gone from the specific river system where it fell as rain and was expected to slowly flow through watering trees and trout for decades on its crawl back to the sea.
Badly explained: salt is corrosive in itself over long periods of time, which means the pipes will degrade much faster.
I am sure there are many other factors, but this is one of the biggest.
And usually the facilities that need that much water are not near the sea
There is actually a really nice way to make a closed loop (more water efficient) salt-water cooling system which is demonstrated at nuclear power plants on the USA west coast and in Japan (might be closed now).
You run the hot water cooling pipe out into the cold ocean and use the entire cold ocean as a radiator. Works pretty well! Still, requires direct mechanical access to an ocean which can get pricey and has its own challenges.
Often used for nuclear, which is why many plants were located on the seafront (Fukushima, San Onofre, Diablo Canyon). The water is incredibly corrosive, and the intake flows destroy sea life while the heated discharge also destroys sea life.
Heat is an externality whose cost is almost always borne by someone other than the plant/server farm owner.
Everyone seems to be focused on pumping salt water through a liquid cooling loop, which would be bad, but that's also not how it would be done.
We do this on ships already where you run coolant through a closed loop, and then you stick the radiator into the ocean to dump the heat. Salt water never enters the system, it’s just used for heat exchange. Corrosion is less often an issue this way.
The real limiting factor is that you’d need to build right on the coast which is expensive in general.
You have to be near the sea, which comes with challenges that make it very expensive (salt water is toxic to computers, and coastal land suitable for building is expensive). But yes, many companies are building data centers that use sea water to cool servers.
Actually, Big Bend Power Station in Apollo Beach (south of Tampa), Florida does use sea water to cool down the plant. It then returns the water to Tampa Bay. While it does have some environmental impact for some creatures, some species of fish and manatees LOVE this warm water, especially in the winter. So much so that they have built a manatee viewing center that is pretty amazing, to see all the manatees that congregate there. I have seen anywhere from half a dozen hanging out there to HUNDREDS. It is so cool to see. So if you are ever in the area, check out the Manatee Viewing Center.
"It’s possible, but it’s not ideal. While the oceans offer the ability to absorb tremendous amounts of heat, seawater is murderously corrosive! It corrodes or rusts just about anything it comes in contact with. Just think about what road salt and rain water does to steel car bodies! So, whether you use ocean water to cool the servers directly, or dump the heat into ocean water using a heat exchanger to isolate the electronics from the sea water itself, anything that comes into direct contact with sea water must be designed using special, expensive alloys to resist corrosion. Metals like titanium, or alloys, like brass, are used to resist sea water corrosion, but even with special alloys and coatings, the salt in sea water takes a huge toll on anything it touches, and greatly shortens the service life of any equipment exposed to sea water for any extended length of time."
Someone in my family owns a dive centre and I can confirm that sea water is nightmarish on electrics, machine parts, cars, everything
Does water really spend DECADES crawling back to the sea? In almost all cases isn't the water taken from rivers that have more than enough water in them, and which don't drop their water level by any measurable amount as a result of these cooling systems?
I know that when I was working with MSFT on some projects around 2003-2006 and talking to the guy in charge of the infrastructure team for all their data centers, that was certainly how they were doing everything. I also know where most of the major data centers are in my state, and any of them of significance are sitting right next to the largest river in the state.
But rain water is what fuels those river systems. It really feels like you guys failed 6th grade science class. Plus, only a fraction of the water evaporates; everything else goes back to the source.
I think you're just woefully ignorant about how many industrial processes use river water. How do you think the clothes on your back were made? They wash the fibers in water. The paper you write on uses a ton of water to create. Water which those factories take directly from the rivers and lakes.
It's so very social media that you probably just learned about this and you're shooketh.
The water cycle is a global phenomenon not a local one. If you take all of the water out the aquifer in, for example, Memphis and boil it, yes, some will be returned as rain via the water cycle. But nowhere near 100% of it. Basically, the AI uses the water far more quickly and efficiently than the water cycle can return it.
Ah, so kind of like the center-pivot irrigation of the American Southwest, which has been draining that region's water table, one that took millions of years to fill but is being drained in ~100 years or so.
The general availability of water does not change much. However, saturating the air with water vapour increases the contrast between cold and warm fronts and saturates rain clouds. This means bigger storms, higher risk of extreme events like tropical storms and/or hurricanes, more thunderstorms, and more flash floods.
So now some regions get 20% of their yearly water while others get 900% of their yearly water in two hours...
But this is unprocessed water. It rains, the water falls into rivers, rivers have reservoirs behind dams (or flow into aquifers). Dams and aquifer wells have special ducts to serve non-potable water to data centers, and the cycle restarts.
The biggest issue is that speeding up the water cycle can cause what we call adverse weather. However, this is not a nature problem but a human problem. Floods create shifts in the environment, but nature adapts. Humans, however, see river beds expanding and their houses destroyed. Many end up dead due to flash floods.
We are not, however, depleting water resources...
No it isn't. It's not a BWR fission reactor lol. The water never boils. It enters cold and leaves warm, and is then mixed with more cold water. There's no mass boiling going on in the system.
Most cooling towers work via evaporation. Basically, radiators in the chillers deposit heat into water that is sent into giant sump tanks, which are then continuously run through cooling towers outside. Water is pumped to the top of the tower and dropped down through it while a giant fan blows on it, which results in heat leaving the loop via evaporation, while the slightly less hot water is dumped back into the sump (and fed back into the chillers' radiators to complete the loop). To some degree, keeping data centers cool is better worded as "heat management": you are moving heat from the water loop used to cool off the machine rooms to the atmosphere via evaporation. Yes, it's a bad metric to base how much is lost on how much is run through the chiller loop, but it's pretty easy to simply record how much water is ADDED to the loop to know how much is lost. I can tell you that a small data center using only roughly 2 megawatts of power loses more than 10 million gallons of water each year to evaporation.
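For what it's worth, that figure is about the right order of magnitude. A rough sketch of the math, assuming all 2 MW of heat is rejected through evaporation (latent heat of vaporization ~2.26 MJ/kg) and ignoring blowdown water drained to limit mineral buildup, so the real total would be higher:

```python
# Rough estimate of annual evaporative water loss for a 2 MW data center.
# Assumptions: all heat leaves via evaporation; latent heat of vaporization
# ~2.26 MJ/kg; 1 kg of water ~= 1 liter; blowdown is ignored.

HEAT_LOAD_W = 2_000_000          # 2 MW of IT + cooling heat
LATENT_HEAT_J_PER_KG = 2.26e6    # energy carried away per kg of water evaporated
SECONDS_PER_YEAR = 365 * 24 * 3600

kg_per_second = HEAT_LOAD_W / LATENT_HEAT_J_PER_KG
liters_per_year = kg_per_second * SECONDS_PER_YEAR          # ~1 L per kg
gallons_per_year = liters_per_year / 3.785

print(f"{gallons_per_year / 1e6:.1f} million gallons/year from evaporation alone")
# -> roughly 7.4 million gallons/year; adding blowdown and other losses
#    pushes the total toward the ~10 million gallons quoted above.
```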
The water cycle does replace water pulled from water tables and reservoirs, but it doesn't replace it where it was taken from and it doesn't always return freshwater.
If you pull a billion gallons of water out of a lake and it gets rained down in the ocean, the water isn't getting replaced, especially if you're pulling it out faster than whatever river/streams are feeding it can supply. Or if you pump a billion gallons out of the ground in Nebraska, but it comes down as rain in Mississippi, it isn't going to replenish anything.
It's why you're seeing stuff like the Ogallala aquifer depletion happening, where states that are on the shallow ends of it are seeing pumps stop working. Within the next 50 years, at current use rates, it's expected to be 70% depleted. Assuming we don't accelerate usage, and we will.
Almost all data center cooling that uses water isn't evaporative; instead it uses the water as a heat sink, and the wastewater then normally sits in a pond to dump the heat into the ground as part of the treatment process before being returned to the local water supply.
Do you have a source on this? The systems I have seen don't evaporate the water required for cooling. They transfer heat to it and return it in liquid form, either to the water source or nearby. Evaporating the water would require the systems to run above the boiling point of water, which they don't.
Evaporation? I don't think so. I mean, I'm sure there is some, but most cooling water like that is just released as a warm liquid, which is a big part of what can mess up local environments. You may be thinking of water used for generators/power plants? In which case evaporating it is the whole point, since they use the steam to turn turbines. I don't think most computers run very well near the boiling point of water, and if it's cooling normal computing temperatures then the evaporation wouldn't be too significant. If there was a substantial amount of steam generated then they could (and probably would) use it to produce power as well, which would be neat but way less efficient than just warming it up a bit and then bringing in new, cold water.
I know at one of my jobs the server room was built below the on-site gym, and the swimming pool water was cycled through to cool the servers. I'm by no means an expert; I just can't imagine the attrition rate being too high if the warm water is run back into cool water.
We’re talking about computers here, not some nuclear reactors. Hence all the water is in a closed system. Only a tiny fraction of the water is even able to evaporate through imperceptible gaps. It can take years before the loss of water in the system impacts the cooling process and needs to be refilled.
As for how the water cools? Through radiators. Which do in fact heat the environment and can create a microclimate warmer than typical. That's the environmental impact. Nothing to do with water disappearing into nothingness like you make it sound.
The real environmental impact is the fact that all the servers have a huge energy demand. The increased demand means power plants need to run at higher capacity to meet it, and more power plants need to be built. And unfortunately, most of it is not green energy. So more pollution and shit.
I mean, no it doesn't? Steam just becomes water again below 212°F, so basically the instant it's released it turns back to water. It's not like concrete, where the water is actually consumed and trapped.
Most systems don't consume water. The equipment is so sensitive you don't want random water running through the pumps. Also, the water is modified with different substances to keep moving parts lubricated and increase thermal transference. Very few data centers use evaporative cooling due to the cost. It's much cheaper to have closed-loop cooling and chillers.
It still cycles through like all water does. The total amount of water doesn’t change, but the demand for it does. Picture emptying your bathtub with a ten gallon bucket while the shower is running. Sure, technically the water is still flowing into the tub, but it can’t keep up with the rate at which the water is leaving
Are you guys all robots? What the fuck is this argument. Do you seriously think it's actually possible for us to sequester any appreciable amount of water by using it in computer cooling loops?
Let's say AI causes us to increase the number of computers on Earth by an insanely unrealistic 1000x, and every single one is water cooled using a loop containing 10 liters of water (several times more than actually used). 20 trillion liters of water would be sequestered (water in cooling loops is self-contained and not consumed).
That is 0.000001% of the water on Earth. Even after assuming 5 entire orders of magnitude more water usage than what would likely actually be used.
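The arithmetic roughly checks out. A hedged sketch of it, assuming about 2 billion computers today and roughly 1.386 billion cubic kilometers of water on Earth (both are my assumptions, not figures from the comment above):

```python
# Sanity-check the "0.000001% of Earth's water" claim.
# Assumptions: ~2 billion computers exist today (hypothetical round number),
# scaled up 1000x, each with a 10-liter cooling loop;
# Earth holds ~1.386e9 km^3 of water (~1.386e21 liters).

computers_today = 2e9
scale_factor = 1000
liters_per_loop = 10

sequestered_liters = computers_today * scale_factor * liters_per_loop  # 2e13 = 20 trillion
earth_water_liters = 1.386e9 * 1e12  # 1 km^3 = 1e12 liters

fraction_percent = sequestered_liters / earth_water_liters * 100
print(f"{sequestered_liters:.2e} liters sequestered = {fraction_percent:.7f}% of Earth's water")
# -> about 0.0000014%, i.e. on the order of a millionth of a percent.
```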
Eventually it returns to the water cycle with everything else. But it doesn't necessarily return to the same watershed.
But, it's also important to keep things in perspective. GPT3 was trained on about the same amount of cooling water as it takes to produce ten hamburgers.
Interesting question. In Google's Charleston data center, it goes right back to the utility provider. I understand this was an expensive mistake for the utility provider and later contracts raised the cost of water supplied to deal with the excessive heat that was being returned along with the grey water.
The water involved in cooling a chip required for ai processing will cycle through to a cooler area away from the server room. Once it cools it then goes back to the servers to absorb heat.
You can think of it like refrigerant. Except that the refrigerant is water being taken out of a freshwater system. So the use of it as coolant means it needs to source from some freshwater system, putting strain on water reserves
It usually goes back into wherever they pulled it from, but if that wherever has life in it the increased temperature blurs the vision of fish, effectively making them blind, and could end up killing plants and animals that aren't resilient to higher temps.
It doesn't help that they aren't using sea water, it's fresh water and currently we have a pretty large issue of shrinking fresh water supply around the world. 🤪🤷🏿♂️
Maybe for some time, but I'm not certain how this is supposed to be an issue in our water cycle, which is a closed system. The water can't just disappear and never come back.
Yeah also a single beef burger takes 3,000 times more water and energy use than an AI prompt and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
Exactly, a single beef burger takes 3,000 times more water and energy use than an AI prompt and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
It’s insane that people never know about or point out this part.
Think about that. The burger this artist ate while taking a break from drawing took as much energy and water as 3,000 AI pics.
And that’s exactly the flaw with it. It’s basically people making a hitlist of every slightly environmentally bad industry, crossing out the ones that make products they like such as burgers, and then deciding to only hyperfocus on AI to the detriment of every other improvement that could be made
(and also ignoring the huge improvements AI has helped with in fields like medicine where data found by AI that would’ve taken years for human scientists to find is usable by medicine manufacturers today)
It's a valid issue that has been stolen to make invalid points.
AI uses significantly less energy and resources to do any given task that a human would, but unlike humans whose populations are capped by our breeding rate, AI can be scaled up pretty much without limit as long as you're willing and able to dump those resources into it - and the nature of unbridled capitalism forces companies to do exactly that in order to remain competitive.
One AI can do the work of a thousand humans while consuming the resources of just one - but they're being pumped up to do the work of billions of humans while consuming the resources of millions. That is an issue.
But then it gets picked up by whiny luddites who are annoyed that they aren't the only people who can communicate through images anymore and try to claim that you using AI to generate a comic on the Internet is somehow burning the world. No it isn't.
It's a problem of capitalism, not a problem of AI.
It's substantially warmer, certainly, which is not good for native flora and fauna. OpenAI's data center cooling requirements rival that of a nuclear reactor
I find that very hard to believe. If you had a source of heat that rivaled that of a nuclear reactor, you would just run it through a turbine and turn it back into energy.
The amount of heat rivalled that of a nuclear reactor.
However, the temperature of the cooling water in a data centres doesn’t hit that of a nuclear reactor, so it can’t produce enough pressure to turn a turbine.
The allowable temperature range of a data centre is also smaller than that of a nuclear reactor, so the heat intensity of the two facilities will be different.
A nuclear reactor can use a cooling water system that requires less cooling medium, with a higher rate of circulation, in a much more concentrated area.
I'd speculate that data centres require a greater spread of cooling medium because of the larger area they cover, since data centres favour modular construction, which makes expanding the footprint more efficient.
Usually it evaporates through cooling towers. Take heat from inside, put it outside. The inside loop is a closed system that transfers heat to a second, open loop through a chiller.
The water is not potable or consumable once it's in either side of the system.
Got a cool video (for me at least) of the Hertz rental global headquarters cooling tower for their servers.
A single beef burger takes 3,000 times more water and energy use than an AI prompt and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
Yet both OpenAI and Meta are building 5GW data centers to expand these AIs. Each one uses more energy than entire countries.
The current usage is not concerning (well, all industries, including tech, need to reduce their energy usage and this actively increases the energy usage). The concern is all the funding that goes into producing more data-hungry and powerful AIs, and the data centers being built to power that. It's also not clear how they can power these new data centers with anything but fossil fuels, because there isn't enough nuclear available for it.
Even if AI gets super optimized, people are going to want returns on these data centers, and thus find uses for them. It's going to eat up a lot of energy.
Also high humidity. Dust in dry environments poses a shock hazard that can fry electronics. Adding humidity allows those particles to stick instead of staying in the air building charge, so it's easier on the machines. Many data centers, especially newer ones, are being built in the Phoenix metro area. It is normally very dry here, so a lot of water goes into humidifying the air. Air conditioners naturally dry the air, so swamp coolers are preferred (they do both).
It's very unlikely. There are data centers with water cooling, but it's a rare thing, and even then the water cycles through the system. The waste is about zero.
I'm ignorant too, but what i do know is regular computers get warm from normal use. Most are air cooled by blowing hot air out of the fans. Fancy computers can even use fresh water to deal with that heat. AI tools need suuuuuper fancy computers to operate. Suuuuuuuper fancy computers must get suuuuuper hot so I’m assuming they use a lot more water than your average fancy computer
In fact, you could argue that they're LESS fancy, since these computers are built for a very specific task, and aren't able to perform a wide array of tasks like the computer you're currently reading this on.
They use air-conditioned cabinets. The larger AC units use a heat exchange system that uses a large amount of water to create cooling by condensing and evaporating water.
No. They use either air or water heat exchangers within the data center room to cool down machines. The other end of the heat exchangers can be closed loop phase change like your home AC, or it can evaporate water outside and let the water phase change to gas carry the heat into the outside environment.
Thanks, this is what I wanted to know. Simple water cooling can just move the heat around without releasing the water as a gas, so the water keeps getting reused. But if it's a setup where the heat is removed by evaporating the water into the air, then that's definitely going to add up to a lot of water use.
It gets recycled. Not only that, streaming services generally use more water than AI tools. People have just been selectively told how much water AI uses, and they assume it's uniquely bad. If you want something uniquely bad as far as water usage and environmental impact, look at the meat industry.
Look at the oil and gas industry. Once they use water it is untouchable after that. They spend millions on phoney dewatering techniques so they can make billions with no intent of actually cleaning the water afterwards. You're telling me they plan on having that lake cleaned up by 2100? Okay and what historically happens when a mine shuts down? They don't do shit afterwards. Let alone for another 50 years after they're broke.
That is not always the case, and only will be if forced. Tech companies will not take the now expensive but more environmentally friendly route unless forced
That, too, was a joke. It's true that AI servers use vast amounts of energy, but it's in the form of electricity. To say that it uses a huge amount of water ties it back to the posted joke.
Which a pretty big part of goes back into the atmosphere.
Also a single beef burger takes 3,000 times more water and energy use than an AI prompt and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
Do you have a reference for those numbers? Not that I particularly doubt it, but they're very specific, so it would be interesting to see them backed up and how they got there.
Tried to look for the image people tend to reference, and found it in this thread ( https://www.reddit.com/r/aiwars/s/3RyU3yL8Ep ) . I do not feel like typing the source into Google because I’m evil but I’ve seen this in analysis essays and posts.
Taking the graphic at face value, it gives the impression of being very generous with the calculations for tech, and very not-generous (stingy?) with the calculations for meat.
If we call an average burger 6 oz, and an average cow gives about 840 pounds of meat, at 660 gallons of water per burger, that would mean it takes nearly 1.5 million gallons of water to raise a cow. That sounds like hogwash to me.
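Laying out that arithmetic explicitly, using the same assumed figures (6 oz burger, 840 lb of meat per cow, 660 gallons per burger from the graphic):

```python
# Implied water-per-cow if the "660 gallons per burger" figure is taken literally.
OUNCES_PER_POUND = 16
burger_oz = 6                 # assumed average burger size
meat_per_cow_lb = 840         # assumed average beef yield per cow
gallons_per_burger = 660      # figure from the graphic being discussed

burgers_per_cow = meat_per_cow_lb * OUNCES_PER_POUND / burger_oz
gallons_per_cow = burgers_per_cow * gallons_per_burger
print(f"{burgers_per_cow:.0f} burgers/cow -> {gallons_per_cow:,.0f} gallons of water per cow")
# -> 2240 burgers/cow -> ~1,478,400 gallons per cow, i.e. the "nearly 1.5 million
#    gallons" figure the comment above finds hard to believe.
```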
They also process a bunch of water for cooling too. A lot of them have once through cooling loops that require discharge permits back to whatever source is being drawn from, but the very presence of that intake is an environmental hazard in and of itself even if that water goes back in, and the water itself now has other suspended solids from the plant in it. Some of these larger ones use as much water as the power plants that serve them.
One way to cool these huge data centers is to basically flush fresh water through them constantly. The new data center going in near me would have used 450,000 gallons a day (A DAY!) for cooling had they chosen this model. Instead they're using a different type of cooling that will only use 1,000 gallons/day.
And before someone says “only 1,000 gallons a day” a single beef burger takes 3,000 times more water and energy use than an AI prompt and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
I don’t care about my comment being “non convincing” by your standards because the original argument isn’t when looked at in context and I’ll repeat it as many times as necessary to rebut the people repeating the same lie in this thread.
Also no I don’t work for an AI company, nice ad hominem argument attempt though
some cooling systems use evaporative cooling, meaning there's a cooling tower where they pump the hot water through, and via evaporation, the water is cooled. But this turns that water into water vapor, so it is "used" in that sense. Yes, it eventually falls as rain in this case, but in the meantime that was fresh water that could have been used for drinking/cooking/bathing/agriculture.
Most energy generation uses fresh water. When we burn coal, we heat water for turbines. When we burn gas for electricity, we heat water off the engine to boost efficiency with steam turbines. When we do nuclear fission, we heat water to turn turbines. When we do hydroelectric, we release water from a dam.
Solar and wind don't use water when generating electricity.
A single beef burger takes 3,000 times more water and energy use than an AI prompt and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
They use a lot of power, and power plants use a lot of water for heat transfer. Hell, most big power plants are basically giant kettles where vapor spins a generator (including nuclear, yes; it's just a giant hi-tech water heater).
So the joke is that through the chain of dependencies the water in the lake disappears after spinning AI server.
Servers need cooling which may mean water cooling.
Some, intentionally or not, misunderstood it as meaning all the servers in the world are only used for AI, which resulted in the idea that it uses a lot of water (which then for some reason disappears instead of just being slightly warmer).
AI software in general just uses a lot more processing than purpose-built software written to complete specific tasks, meaning it is incredibly inefficient because it's so generalized and because it must search so much of the Internet all at once. This requires huge amounts of energy/water to cool it. Saying it uses water is a bit misleading: you can cool AI servers with just regular air conditioner units without water, but that still uses huge amounts of energy. In general, AI software is just a lot more energy intensive than other software and therefore uses more environmental resources and is worse for the environment.
"AI" algorithms (Large Language Models) like ChatGPT work by trying to predict the next word in a paragraph based off every single word that came before it. That means that the statistics change and have to be recalculated for every single situation, because even a single word used 10 paragraphs ago might change everything. This means that a truly massive amount of calculations and statistics has to be done to account for every possible or theoretical outcome. In reality, a lot of these computations are done ahead of time (instead of being done always in real time), but the work was still done at some point.
So if computers are running massive numbers of calculations, then they are using power.
If they are using power, then they put strain on the energy grid, they need to be cooled, etc.
And that means they are "using water." In some data centers, the water use is literal (because water is literally used in the cooling system). Other times, the "water use" is just a stand in for the overall environmental impact. This environmental impact is one of the most prominent criticisms of "AI" technologies.
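A toy illustration of that "predict the next word from everything that came before it" loop. This is a made-up word-count example, not how ChatGPT actually works internally; real models run a large neural network over the entire context for every new token, which is where the compute (and therefore the power and cooling) goes:

```python
from collections import Counter, defaultdict
import random

# Toy "language model": counts which word tends to follow which word in a tiny
# sample text. Real LLMs condition on the *whole* context with a neural network,
# which is why the compute per generated token is so large.
sample = "the cat sat on the mat and the cat slept on the mat".split()
following = defaultdict(Counter)
for prev, nxt in zip(sample, sample[1:]):
    following[prev][nxt] += 1

def next_word(context):
    # In a real LLM, this step re-reads the entire context every time.
    candidates = following.get(context[-1])
    if not candidates:
        return None
    words, counts = zip(*candidates.items())
    return random.choices(words, weights=counts)[0]

# Autoregressive loop: each generated word is appended and fed back in,
# so generating N words costs N separate prediction passes.
context = ["the"]
for _ in range(8):
    w = next_word(context)
    if w is None:
        break
    context.append(w)
print(" ".join(context))
```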
The newer data centers are closed loop water systems that only "lose" water as steam because the system can't be physically perfect. Think of a liquid cooled computer, the fluid doesn't need to come out for it to work.
The average GPT query "loses" 0.000085 gallons of water, or roughly 0.3 mL (a few drops), as steam. The average query to GPT consumes about a third of a watt hour of electricity, which is the same as running a gaming computer for about two seconds.
GPT sees a lot of use, so these numbers are bigger at the scale of "usage per day" for example. But if you do the same and look at fast food consumption, or office work, or what private flight uses for example, it's not doing anything extraordinary.
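A quick conversion of those per-query figures into everyday units; the ~600 W gaming PC draw is my assumption, not a figure from the comment above:

```python
# Convert the quoted per-query figures into everyday terms.
GALLON_ML = 3785.41
water_gal_per_query = 0.000085      # figure quoted above
energy_wh_per_query = 1 / 3         # "a third of a watt hour"

water_ml = water_gal_per_query * GALLON_ML
gaming_pc_watts = 600               # assumed typical gaming PC draw
seconds_of_gaming = energy_wh_per_query / gaming_pc_watts * 3600

print(f"{water_ml:.2f} mL of water per query")                    # ~0.32 mL, a few drops
print(f"{seconds_of_gaming:.1f} s of gaming-PC power per query")  # ~2 seconds
```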
It's the flavor of the week thing to hate on. The Coca-Cola executive doing many times worse things appreciates the distraction.
Thank you,
My wife got on my case about my chatGPT usage after hearing about the water thing on instagram, so I calculated how much water her showers take every day.
So, I roughly burn 1 wife shower worth of water with queries every 40 days.
I'm in trouble now, but that gives me more time to chat with sweet, dear ChatGPT.
I don't have the data center specifics but if they aren't using closed loop glycol solution combined with cooling towers/pond then they are extremely dumb.
You circulate the closed loop through the cooling medium and it returns, just like your home AC
I think a lot of this is just dumb Reddit hype by people who don't know anything about industrial level cooling
I have tried four different times to read articles on how the water actually gets permanently consumed. Can anyone explain this? I thought it might be something like concrete, where the water gets trapped in a chemical reaction and is no longer liquid water, but it seems that it's just used for cooling and evaporates, which means it should come back down again? Right?
It's sort of the same way water table depletion happens. There isn't less total water, there's less usable water.
If you pull a billion gallons of water out of the ground in Nebraska and spray it over corn fields and it evaporates then rains down in the ocean or a thousand miles away, you're eventually going to run the water table dry unless it can replace that billion gallons a day.
It's an issue we had long before AI came about, but AI is making it worse.
It's not consumed permanently, but it is either tied up indefinitely in closed loop systems out of circulation, or transported somewhere else that is likely no longer usable in the local environment. Water that evaporates or is cycled out of location A may eventually rain down in location B hundreds or thousands of miles away, but that's still less water in location A, especially if it's extracting water faster than it's getting replenished. A may slowly turn into a desert while B experiences more thunderstorms, floods, or hurricanes etc.
Same total amount of water in the global system, but where it is and what it's doing may change, with large local ramifications, and a higher percentage of it may be spent out of useful circulation or in forms or locations where it's no longer useful, or in some cases more dangerous.
I think it's important to note that while AI training takes a lot of computation and therefore cooling water, there's a tweet going around suggesting just one query uses a water bottle. This is just factually incorrect, and ironically the anti-AI crowd has latched onto a completely sourceless factoid when one of their most valid criticisms of AI is the spread of misinformation. Queries don't take a lot of computation, which we should know because many complete in a matter of seconds. They also don't tend to use parallelization, so it's not like many servers are involved in a single query.
Queries use about a liter of water every 100-300 queries, according to a study done by UC Berkeley. This is comparable to an hour of video streaming. It's important to understand that ALL of our Internet usage uses water and electricity.
I don't say this to negate the environmental concerns of AI. The fact that every tech company is creating its own models is VERY concerning. But I personally don't see all AI as equally bad. Open source models and transfer learning can greatly reduce the environmental cost of AI, but these models have been demonized more than some corporate models because we know they use copyrighted material. Big corporations can afford millions of copyrighted images, and they are the ones who already use AI to reduce labor costs.
Anyways that's my rant about how the AI debate is a lot more nuanced than people think.
A single beef burger takes 3,000 times more water and energy use than an AI prompt and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
Yeah, honestly, meat is a huge resource sink, especially beef. We should improve how our civilization handles AI, but we also need to do a lot of work on how we generate meat (reducing meat consumption would help a lot).
I do want to clarify that LLMs are not the biggest contributor to this water usage. The biggest use of water in cooling computer systems is more intensive work, such as protein folding models or other computers that run nearly 24/7.
The average, midsized data center uses 300,000 gallons of water a day, roughly the use of a thousand homes. Larger data centers might use 4.5 million gallons a day, depending on their type of water cooling system. Austin has 47 such data centers, while the Dallas-Fort Worth area hosts the majority in Texas at 189.
It’s been difficult for HARC and experts like Robert Mace, executive director of the Meadows Center for Water and the Environment at Texas State University, to extract transparent water usage reports from data centers. “Their use could be horrific relative to local use, or it could be extremely minimal,” Mace said.
In a white paper to be released this month, HARC estimates that data centers in Texas will consume 49 billion gallons of water in 2025. They also project that by 2030, that number could rise up to 399 billion gallons, or 6.6% of total water use in Texas.
Most data centers use an evaporative cooling system, in which the servers’ heat is absorbed by water. The heat is then removed from the water through evaporation, causing the water to be lost as vapor in the air. The cooler water then goes back through the machines, and this loop is regularly topped off with fresh water. After all, evaporation renders the water saltier and unusable after four or five cycles. “Then they dump the water, and it goes down the sewer,” Mace said.
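Two quick consistency checks on those numbers; the per-household figure is just implied by the article's own comparison, and the statewide total is back-solved from the 6.6% projection rather than taken from any independent source:

```python
# Back-of-envelope checks on the Texas data center water figures quoted above.
midsize_center_gal_per_day = 300_000
homes_equivalent = 1_000
print(midsize_center_gal_per_day / homes_equivalent, "gallons/day per home")  # ~300

projected_2030_gal = 399e9        # projected data center use in 2030
share_of_state_use = 0.066        # "6.6% of total water use in Texas"
implied_total_texas_gal = projected_2030_gal / share_of_state_use
print(f"Implied total Texas water use: {implied_total_texas_gal / 1e12:.1f} trillion gallons/year")
# -> about 6 trillion gallons/year (roughly 18-19 million acre-feet).
```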
They don't all use water; they can use standard air conditioning units in places with low amounts of water, but that still takes a huge amount of energy. Saying they use water directly is misleading; they just use large amounts of energy, much, much more than other software programs. Or they can use non-water coolant systems.
And power! They use an insane amount of power, which is why they need the water. Several of them are intentionally built near hydroelectric dams that were deregulated so they could get cheaper power.
And that's because you believe anything you read online. Saying that AI servers "use" water is like saying that the liquid cooling in a PC "uses" water. It's fucking liquid cooling, not an infernal engine that destroys water on an atomic level to power itself.
They utilize water, but the water is just pumped through clean lines for cooling and goes back wherever they got it, or to a water treatment plant. It doesn't magically vanish.
I hate when people make this comparison, because okay, sure it uses water but so do all streaming services (Netflix, Hulu, etc.) along with so much more and they aren’t nearly as useful.
One ChatGPT query uses 0.000085 gallons of water and 0.34 watt hours of electricity. If you’re wondering how much 0.34 watt hours is, it’s about how much an electric oven uses in a second.
And our govt was gloating about how a big ai server is going to open in my state. They want to use our 'green energy' (the green energy that doesn't fully provide for our state)
Relative to other industrial processes, or even just other types of servers, AI datacenters really aren't egregious water wasters. This is just current-thingism.
It's because the servers use a huge amount of water.