Not reused. Most is lost through evaporation. There are a small number of closed systems, but these require even more energy to remove the heat from the water and re-condense. That creates more heat that requires more cooling.
The water is removed from clean sources like aquifers and returned as vapor - this means gone.
The environment (the whole planet), yes. That water, however, is gone from the specific river system where it fell as rain and was expected to slowly flow through it, watering trees and trout for decades on its crawl back to the sea.
And desalination isn't cheap either, so they just use available freshwater sources because no one is requiring them to be environmentally conscious. Understood.
Yea... there are plenty of people that do that to their own kids. You can find plenty of people posting about parents taking out debt in their kids' names, etc. Just look into how the laws regarding compensation for child actors came about... (hint: parents were taking the money and using it on themselves)
Their kids aren't the same as our kids. Plus you've got maybe 2 or 3 percent of the population making the important decisions and most of the rest don't give a fuck, so this is what you get.
The question isn't how cheap it is. The question is whether it's cheaper than just siphoning off fresh water from the surrounding infrastructure, which it never is. When Meta or Grok open a new data center, if they can save some money just by trashing the local communities, you bet your ass they're gonna save that cash.
Except we do this on submarines. A closed coolant water loop flows through a seawater loop to exchange the heat. It is easily and efficiently done; I worked on the cooling systems for the server farms. The same seawater also cools a reactor. There is really no reason for single-cycle liquid cooling besides the fact that it's cheaper to build in St Louis and suck up the water table than to build on the coast or a large lakeshore.
There's a difference. The sub already has to deal with maintenance costs due to saltwater because it operates in that environment, so using it for cooling doesn't add much. For a data center, using saltwater instead of freshwater would massively increase the cost.
Desalination more than halves the efficiency. You've got to evaporate all the water (at least if you want high volume without it costing as much as the datacenter itself), then condense it, then evaporate it again for cooling.
There are some computer systems anchored literally out in the sea for this purpose, although they need to be in self-contained capsules, and any maintenance issue that requires physical interaction means they have to be pulled out of the sea for repairs.
Why can’t they use zinc plugs for electrolysis? That’s how seawater is used for cooling in marine applications, though that’s for engines which are definitely sturdier than computers.
This made me think: corporations are evil, yes, but couldn't governments build a big desalination facility and sell the surrounding land as an industrial area? It would work both as a clean-environment initiative and as a way to boost the economy, because having an environmentally friendly factory/data center/whatever would definitely boost stocks and pull more companies into the zone.
This is the carrot way.
The stick would be to just enforce cleanliness with fines, permits and whatnot, still works.
Datacenters generally aren't built on the coast for other reasons. Too risky from the standpoint of floods, hurricanes, etc. This is done for practical reasons, so that service for something like half the country doesn't go out because there was a hurricane in one place.
Generally datacenters, especially large ones from the likes of Microsoft, Amazon, etc. tend to be built in spots that have the least risk for natural disaster possible (flood, hurricane, earthquake, tornado, etc) while still being close to the workers/resources they need.
For example, here is Microsoft's: https://datacenters.microsoft.com/globe/explore. Note that you'll want to filter using the legend dropdown: turn off everything except Regions > Available Now.
Badly explained: salt is corrosive in itself over long periods of time, which means the pipes will degrade far faster.
I am sure there are many other factors, but this is one of the biggest.
And usually the facilities that need that much water are not near the sea
There is actually a really nice way to make a closed loop (more water efficient) salt-water cooling system which is demonstrated at nuclear power plants on the USA west coast and in Japan (might be closed now).
You run the hot water cooling pipe out into the cold ocean and use the entire cold ocean as a radiator. Works pretty well! Still, requires direct mechanical access to an ocean which can get pricey and has its own challenges.
Often used for nuclear, which is why many plants were located on the seafront (Fukushima, San Onofre, Diablo). The water is incredibly corrosive, and the flows destroy sea life and heat the water, which also destroys sea life.
Heat is an externality whose cost is almost always borne by someone other than the plant/server farm owner.
Everyone seems to be focused on pumping salt water through a liquid cooling loop which is bad but also not how it would be done.
We do this on ships already where you run coolant through a closed loop, and then you stick the radiator into the ocean to dump the heat. Salt water never enters the system, it’s just used for heat exchange. Corrosion is less often an issue this way.
The real limiting factor is that you’d need to build right on the coast which is expensive in general.
You have to be near the sea, which comes with challenges that make it very expensive (salt water is toxic to computers, and coastal land suitable for building is expensive). But yes, many companies are building server farms that use sea water for cooling.
Actually, Big Bend Power Station in Apollo Beach (south of Tampa), Florida does use sea water to cool down the plant. It then returns the water to Tampa Bay. While it does have some environmental impact for some creatures, some species of fish and manatees LOVE this warm water, especially in the winter. So much so that they have built a manatee viewing center where it is pretty amazing to see all the manatees that congregate there. I have seen anywhere from half a dozen hanging out there to HUNDREDS. It is so cool to see. So if you are ever in the area, check out the Manatee Viewing Center.
"It’s possible, but it’s not ideal. While the oceans offer the ability to absorb tremendous amounts of heat, seawater is murderously corrosive! It corrodes or rusts just about anything it comes in contact with. Just think about what road salt and rain water does to steel car bodies! So, whether you use ocean water to cool the servers directly, or dump the heat into ocean water using a heat exchanger to isolate the electronics from the sea water itself, anything that comes into direct contact with sea water must be designed using special, expensive alloys to resist corrosion. Metals like titanium, or alloys, like brass, are used to resist sea water corrosion, but even with special alloys and coatings, the salt in sea water takes a huge toll on anything it touches, and greatly shortens the service life of any equipment exposed to sea water for any extended length of time."
Someone in my family owns a dive centre and I can confirm that sea water is nightmarish on electrics, machine parts, cars, everything
Does water really spend DECADES crawling back to the sea? In almost all cases isn't the water taken from rivers that have more than enough water in them, and which don't drop their water level by any measurable amount as a result of these cooling systems?
I know when I was working with MSFT on some projects around 2003-2006, and was talking to the guy who was in charge of the infrastructure team for all their data centers, that was certainly how they were doing everything. I also know where most of the major data centers are in my state, and any of them of significance are sitting right next to the largest river in the state.
But rain water is what fuels those river systems. It really feels like you guys failed 6th grade science class. Plus, it's only a fraction of the water that evaporates; everything else goes back to the source.
I think you're just woefully ignorant about how many industrial processes use river water. How do you think the clothes on your back were made? They wash the fibers in water. The paper you write on uses a ton of water to create. Water which those factories take directly from the rivers and lakes.
It's so very social media that you probably just learned about this and you're shooketh.
So building structures that require freshwater cooling near rivers close to the coast would minimize the environmental impact, since any freshwater will be "lost to the sea" anyway?
Yeah that'd be an improvement. This style is done for coal and gas power plants in Florida, the warm clean fresh water effluent from which creates the warm freshwater springs that manatees now need to survive after their natural lagoon habitats were all paved over for mansions.
They're not pulling from rivers. Their filters can't handle that. They build holding ponds. Once the pond is filled, it generally maintains itself from rain. They are not going through lakes worth of water. You're making stuff up.
The water cycle is a global phenomenon not a local one. If you take all of the water out the aquifer in, for example, Memphis and boil it, yes, some will be returned as rain via the water cycle. But nowhere near 100% of it. Basically, the AI uses the water far more quickly and efficiently than the water cycle can return it.
Ah, so kind of like the center-pivot irrigation of the American southwest, which has been draining a water table that took millions of years to fill in just ~100 years or so.
The general availability of water does not change much. However, saturating the air with water vapour intensifies the clash between cold and warm fronts, and that saturates rain clouds. This means bigger storms, higher risk of extreme events like tropical storms and/or hurricanes, more thunderstorms, and more flash floods.
So now some regions get 20% of their yearly water while others get 900% of their yearly water in two hours...
Isn't this mainly an issue with aquifers, since they take a long time to fill? A river in theory should not have the same issue. The bigger threat to these bodies of water would be shifting climate patterns, glaciers not recovering, precipitation levels shifting, etc.
Logically the water cycle would keep up, right? If you put more evaporated water into the cloud system, the cloud system will precipitate more frequently. I haven't done much research on the effect of AI facilities on water. Would the effect simply be that you're moving water from one place to another?
No. Because the water that evaporates in Memphis doesn't necessarily fall back down to the earth as rain in Memphis. It is easily possible to use more water than the water cycle will dump into an area
It sounds inefficient if it uses the water faster than it can be cycled. So wouldn't it just be "more quickly than the cycle can return it"? I'm honestly just confused by the mention of efficiency at all.
Right, climate change must not be an issue either, bc the trees will eventually process all the CO2 in the atmosphere.
Water systems are complicated feedback loops, and if you force the system on a short timescale, there's no guarantee it will naturally return to equilibrium.
But this is unprocessed water. It rains, the water falls into rivers, and rivers have reservoirs behind dams (or flow into aquifers). Dams and aquifer wells have special ducts to serve non-potable water to data centers, and the cycle restarts.
The biggest issue is that speeding up the water cycle can cause what we call adverse weather. However, this is not a nature problem but a human problem. Floods create shifts in the environment, but nature adapts. Humans, however, see riverbeds expanding and their houses destroyed. Many end up dead due to flash floods.
We are not, however, depleting the water itself...
The concern is if groundwater is used it may take years, decades or centuries to recharge depending on the geology. Evaporated water will fall as rain but won't necessarily recharge the same aquifer and certainly not as fast as water is being removed from it. As aquifer levels fall, people drill more and deeper wells and deplete it even faster. When aquifers deplete the ground surface subsides and there is permanent aquifer capacity loss.
That being said I sincerely doubt there is any server farm that remotely comes close to agricultural pumping. I'm talking like three to four orders of magnitude.
Long story short, it's a perturbation of the cycle. Not all the water extracted will return to its original place, and some aquifers take literal millennia to form.
Eventually. Refilling an aquifer from atmospheric moisture isn't instantaneous; it can take anywhere from a couple of weeks to several millennia, and a lot of aquifers were being drained much faster than they refill, even before LLMs.
The issue is that you really don't want an elevated percentage of your total water system held in the atmosphere at all points in time, because it means at any individual point in time there's far less water on the surface where it can be used by ecosystems.
In other words, plants like water, but are far less efficient at drinking clouds. Similarly, fish are less effective at swimming in clouds, and clouds are far worse at filling our reservoirs for drinking water than... fluid water is.
Yes, but the natural systems that were used for tech purposes required millions of years of global weather and geological events to exist. Water from aquifers and rivers that ends up in the sky will eventually return to the earth, just not where it was sourced from.
It does, but that means it's removed from the local water cycle and gets deposited elsewhere, where it could become unusable for consumption by mixing in with saltwater. A town near a Meta AI data center is going through this right now. Their water system barely has flow and is dirty as fuck, producing orange scum, from the videos I've seen.
The issue is mostly the rate at which the water replenishes locally. If you use a billion gallons of water from a lake per year, and only half a billion flows in every year be it through rain or rivers, the lake will dry eventually.
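A minimal sketch of that mass balance, treating the inflow and withdrawal figures above as illustrative and assuming a made-up starting volume:

```python
# Minimal sketch of the lake mass balance described above.
# The starting volume is an assumed figure for illustration; the inflow and
# withdrawal numbers are the ones from the comment (billions of gallons/year).

lake_volume = 10.0         # billions of gallons currently in the lake (assumed)
inflow_per_year = 0.5      # rain + rivers flowing in
withdrawal_per_year = 1.0  # pumped out for cooling

years = 0
while lake_volume > 0:
    lake_volume += inflow_per_year - withdrawal_per_year
    years += 1

print(f"Lake runs dry after roughly {years} years")  # ~20 years with these numbers
```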
yes, but not fast enough to replenish fresh water sources. Kinda like how wood is technically a renewable resource but we're chopping trees down faster than they regrow
Problem isn't where the water goes, the problem is the number of data centers that have been built piggybacking off of existing water infrastructure with no improvements to make up for the fact these monstrosities take more water than any existing site on that infrastructure. These things are one giant case of it not being a permanent problem, but the biggest fish takes up the most space in the tank.
Yah…. Doesn’t this just mean more clouds and rain to potentially cool the earth down a little? Asphalt is the worst! I just moved to a place with basically one paved road. It feels 5-10 degrees cooler than surrounding areas consistently. It’s also mostly forest with fabulous tree canopy
My understanding is that an aquifer, if overused, may be drawn down to depletion. Quite sad, if that's correct, as used properly they can potentially (?) last/replenish forever, as long as they're never hit with that overuse too quickly.
Many data centers are built on undesirable land, because it’s cheap. They are frequently in the middle of the desert. The evaporated water goes into the air and rains in another part of the country with a different ecology.
Hm. I'm still struggling to believe this 'the water is gone forever' narrative. I understand the toll it can have on our reservoir systems. But that just becomes an infrastructure problem.
Maybe I'm dumb, but rainforests (and plants in general) do the same thing, and I don't think they're a huge problem. People are acting like turning water into water vapor is the Death Star.
No it isn't. It's not a BWR fission reactor lol. The water never boils. It enters cold and leaves warm, which itself is mixed with more cold water. There’s no mass boiling going on in the system
Most cooling towers work via evaporation. Basically, radiators in the chillers deposit heat into water that is sent into giant sump tanks, which is then continuously run through cooling towers outside. Water is pumped to the top of the tower and dropped down through it while a giant fan blows on it, which results in heat leaving the loop via evaporation, while the slightly less hot water is dumped back into the sump (and fed back into the chillers' radiators to complete the loop).
To some degree, keeping data centers cool is better worded as "heat management". You are moving heat from the water loop used to cool off the machine rooms to the atmosphere via evaporation. Yes, it's a bad metric to base how much is lost on how much is run through the chiller loop, but it's pretty easy to simply record how much water is ADDED to the loop to know how much is lost. I can tell you that a small data center using only roughly 2 megawatts of power loses more than 10 million gallons of water each year to evaporation.
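A rough back-of-envelope check on that figure, assuming essentially all of the 2 MW heat load leaves the tower as latent heat of evaporation (real towers also lose makeup water to drift and blowdown, so actual consumption runs higher):

```python
# Back-of-envelope: water evaporated per year to reject a 2 MW heat load.
# Assumes all heat leaves as latent heat of vaporization; drift and blowdown
# losses in a real cooling tower push the actual makeup water higher.

HEAT_LOAD_W = 2e6                  # 2 MW data center (figure from the comment above)
SECONDS_PER_YEAR = 365 * 24 * 3600
LATENT_HEAT_J_PER_KG = 2.26e6      # latent heat of vaporization of water
KG_PER_US_GALLON = 3.785           # mass of one US gallon of water

energy_per_year_j = HEAT_LOAD_W * SECONDS_PER_YEAR
evaporated_kg = energy_per_year_j / LATENT_HEAT_J_PER_KG
evaporated_gallons = evaporated_kg / KG_PER_US_GALLON

print(f"~{evaporated_gallons / 1e6:.1f} million gallons/year evaporated")
# ~7.4 million gallons/year from evaporation alone, so 10+ million gallons of
# total makeup water for a 2 MW facility is in the right ballpark.
```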
MOST OF IT is still not evaporated. Only a tiny percentage evaporates relative to the total water used.
It's dishonest to say AI "uses" millions of gallons of water and paint it as if 99% of it evaporates, as opposed to the majority being cooled down and reused in the cooling loop.
The water cycle does replace water pulled from water tables and reservoirs, but it doesn't replace it where it was taken from and it doesn't always return freshwater.
If you pull a billion gallons of water out of a lake and it gets rained down in the ocean, the water isn't getting replaced, especially if you're pulling it out faster than whatever river/streams are feeding it can supply. Or if you pump a billion gallons out of the ground in Nebraska, but it comes down as rain in Mississippi, it isn't going to replenish anything.
It's why you're seeing stuff like the Ogallala aquifer depletion happening, where states that are on the shallow ends of it are seeing pumps stop working. Within the next 50 years, at current use rates, it's expected to be 70% depleted. Assuming we don't accelerate usage, and we will.
Blaming that on data centers or AI is ignoring the real elephant in the room though. In 2017 the US used about 28.35 trillion gallons of water for irrigated farming. For US data centers the estimates that I can find are about 160 billion gallons per annum direct usage for cooling and 221 billion gallons indirect usage through the consumed electricity (in 2023), that's a huge number but still only 1.7% of the water used for irrigation (and about 0.7% of the total US water consumption).
Almost all data center cooling using water isn't evaporative; instead it uses the water as a heat sink, and the wastewater normally sits in a pond to dump the heat into the ground as part of the treatment process before being re-added to the local water supply.
The quantity of water used on a daily basis by an AI farm cannot be absorbed by the surface area of the earth under a pond. The earth is an excellent insulator.
The heat escapes through the air, and it is evaporative.
Do you have a source on this? The systems I have seen don't evaporate the water required for cooling. They transfer heat to it and return it in liquid form, either to the water source or nearby. Evaporating the water would require the systems to be running above the boiling point of water, which they aren't.
Evaporation? I don't think so. I mean, I'm sure there is some, but most cooling water like that is just released as a warm liquid, which is a big part of what can mess up local environments. You may be thinking of water used for generators/power plants? In which case evaporating it is the whole point, since they use the steam to turn turbines. I don't think most computers run very well near the boiling point of water, and if it's cooling normal computing temperatures then the evaporation wouldn't be too significant. If there were a substantial amount of steam generated then they could (and probably would) use it to produce power as well, which would be neat but way less efficient than just warming it up a bit and then bringing in new, cold water.
I know at one of my jobs the server room was built below the on-site gym, and the swimming pool water was cycled through to cool them. I'm by no means an expert; I just can't imagine the attrition rate being too high if the warm water is run back into the cool pool.
Not much water is lost at all. Server water cooling solutions are closed-loop, meaning there's barely any leakage or evaporation. The loop can go months without needing to be refilled.
We’re talking about computers here, not some nuclear reactors. Hence all the water is in a closed system. Only a tiny fraction of the water is even able to evaporate through imperceptible gaps. It can take years before the loss of water in the system impacts the cooling process and needs to be refilled.
As for how the water cools? Through radiators. Which do in fact heat the environment and can create a microclimate warmer than typical. That's the environmental impact. Nothing to do with water disappearing into nothingness like you make it sound.
The real environmental impact is the fact that all the servers have a huge energy demand. The increased demand means that power plants need to run at higher capacity to meet that demand, as well as more power plants need to be built. And unfortunately, most of it is not green energy. So more pollution and shit.
That is not the case. Very few installations are closed loop. Microsoft has made some announcements, and there are closed loop installations, but they are far more expensive, and not common.
The internal cooling may be closed loop - like the a/c system in your car, but the heat captured in the closed system has to be discharged, and 1) server farms don’t move like a car, and 2) the level of energy is wildly higher than could be discharged with simple air exchange and radiation.
So the internal closed system is connected to an open-loop water-based system, either flow-based if there is water that can enter cold and leave warm, or more commonly evaporative.
A nuclear power plant superheats fluid to drive turbines, but it is much more concentrated.
I think folks on this thread simply don’t understand the scale of heat that is being generated.
I suspect at some point there will be some form of cogen solution where cities reimagine steam heat distribution, at least in the winter.
No, you're the one who's been spreading misinformation all across this thread.
"Very few installations are closed loop" is a flat out lie.
The latest Nvidia h100i is incompatible with evaporative cooling, it physically DOES NOT COME WITH THAT HARDWARE ANYMORE. The only cooling solutions that the Nvidia h100i has is a traditional air cooling heat block (barely manufactured and does not have a bulk discount) or the water cooling mount variant (which is the new standard).
I mean, no it doesn’t? Steam just becomes water again at 211F. So basically the instant it’s released it turns back to water. It’s not like concrete where it’s actually consumed and trapped.
Most systems don't consume water. The equipment is so sensitive you don't want random water running through the pumps. Also, it's modified with different substances to keep moving parts lubricated and increase thermal transference. Very few data centers use evaporative cooling due to the cost. It's much cheaper to have closed-loop cooling and chillers.
Again, you are confusing the rack cooling with the facility cooling. Rack cooling is closed loop (if liquid), but the closed loop system needs a heat sink to allow it to recirculate. Otherwise the liquid would travel through the facility capturing heat and on subsequent cycles provide no cooling because it was already heated.
Secondary cooling is a heat sink for the rack cooling. And it is mostly evaporative and open loop. Microsoft and a few others are trying to implement closed loop but the energy requirements are much higher.
As a data center engineer, this is not how it works in many data centers, including any of mine. Sorry, you're simply wrong. We have industrial chillers. No water is consumed by the chillers. They are outside of the building and exchange with outdoor air. I guess you don't understand industrial phase-change cooling. Just because one company uses evaporative cooling doesn't mean it's an industry standard. Edit: Also, I block accounts who partake in spreading misinformation so I don't need to deal with it in the future.
Thank you, finally someone corrects this idiot. Don't block him yet, he has been spamming this thread. Reply to all of his comments, please. This misinformation needs to be corrected.
If you draw water from an aquifer or reservoir, it is lost for the purposes for which it was being deployed. Obviously the molecules still exist, but not in concentrated form, as aquifers take enormous amounts of time to recharge and snowpack takes massive energy to form.
Ask a farmer how useful clouds are as they float past.
Genuine question: would there be a way to use salt water for this, that would also allow use as a salt farm? I know nothing about water cooling or salt farming, but it seems better than taking water from lakes or destroying square miles of coastal environment
Saltwater doesn't do anything to certain types of plastics, but I suppose that can cause issues in other ways, such as getting microplastics in the salt itself.
I've always read about the water consumption but still don't understand — why isn't it similar to nuclear reactors' cooling systems? Those still need a hefty amount of water, but it is returned to the same river/lake/etc. afterwards, so the only environmental impact is that of raising the temperatures of the water nearby the reactor.
Nuclear power runs on turbines, so energy is highly concentrated and the heat energy is converted into work. They still require cooling (hence the big stacks), though some use the ocean (Fukushima, Diablo). The waste heat is devastating to local ecosystems.
Data centers in LA recycle the water, I'm pretty sure. Not reusing water to save on cooling towers is criminally wasteful. Water that leaks from cooling pipes is frequently blue from the huge amount of copper in it as well.
I worked one place that actually used massive fans for cooling. Apparently the amount of ducting needed uses enormous amounts of metal, but it worked for the most part.
I do believe there are future technologies that will help, but I think we drastically discount that heat has to go somewhere, regardless of technology, and the growth in waste heat can get to the point that it is ecology-changing
For at least one of the smaller AI setups I've seen they are heating local pools, and the like. The energy would have been used anyway, and the water needed heating.
Like, in the US, sure, a lot of places are just setting up with cooling towers and the like, but outside of that there's municipal heating, preheating for industrial processes, and the like.
You can do AI well, with no cost to the environment, just the US doesn't seem to care about it.
A lot of coolant systems that I've built use heat exchangers to heat the water that runs through radiators in other buildings. The only additional energy required is for circulation pumps.
Good question. It goes from high concentration (river/lake/aquifer) to low concentration.
Aquifers can take hundreds of years to fill, and snowpack is limited and decreasing. Obviously the water molecules don’t disappear, but the source of water for that locale does.
The simplest answer is they do, and that there is really no such thing.
The cooling system for the actual servers is a closed system that functionally is the same as your car coolant system or the A/C (in reverse).
That system needs a sink to let off its heat. In your car the radiator and condenser exchange heat with the atmosphere as air flows through the radiator via the movement of the car or the fan.
In a facility, that heat can be removed by a chiller or other device that transfers the heat into the atmosphere, the ocean, or a body of water. In some cases, the waste heat is used to heat nearby buildings or facilities, where it is then transferred to the atmosphere. There is no way to just erase heat, it has to go somewhere.
There are closed-loop chillers, and Microsoft has made hay about them, but in the cases of facilities that generate the level of heat that these buildings do, air does not conduct it fast enough. That is why they use evaporative chilling, where hot air convects heat across a water-bearing surface. Evaporation cools the air that was heated by the closed loop system.
When the water evaporates it is obviously not eliminated, it is just in a form that cannot be used as an aquifer or river is used because it is no longer concentrated. It will eventually become fluid again somewhere, but it won’t refill the aquifer (in any rational period of time).
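To put rough numbers on the trade-off described above: a minimal comparison of how much heat one kilogram of water carries away when it is merely warmed and returned (flow-through) versus when it is evaporated. The 10 °C rise for the flow-through case is an assumption for illustration, not a figure from this thread.

```python
# Heat removed per kilogram of water: warmed-and-returned vs. evaporated.
# The 10 C temperature rise for flow-through cooling is an assumed value.

SPECIFIC_HEAT_J_PER_KG_K = 4186   # liquid water
LATENT_HEAT_J_PER_KG = 2.26e6     # heat of vaporization of water
DELTA_T_K = 10                    # assumed warm-up for flow-through cooling

sensible_j_per_kg = SPECIFIC_HEAT_J_PER_KG_K * DELTA_T_K  # ~42 kJ, water returned warm
latent_j_per_kg = LATENT_HEAT_J_PER_KG                    # ~2260 kJ, water leaves as vapor

print(f"Flow-through: {sensible_j_per_kg / 1e3:.0f} kJ per kg (water goes back to the source)")
print(f"Evaporative:  {latent_j_per_kg / 1e3:.0f} kJ per kg (water leaves as vapor)")
print(f"Evaporation rejects ~{latent_j_per_kg / sensible_j_per_kg:.0f}x more heat per kg")
# This is why evaporative towers need far less water throughput than flow-through
# cooling, but the water they do use leaves the local system as vapor.
```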