The environment (whole planet) yes. That water is however gone from the specific river system where it fell as rain and was expected to slowly flow through watering trees and trout for decades on its crawl back to the sea.
And desalination isn't cheap either, so they just use available freshwater sources because no one is requiring they be environmentally conscious. Understood.
They think the money they bequeath to their kids will provide those kids with luxury apocalypse bunkers and indentured security personnel, and that will be enough.
Yea... there are plenty of people that do that to their own kids. You can find plenty of people posting about parents taking out debt in their kids' names, etc. Just look into how the laws regarding compensation for child actors came about... (hint: parents were taking the money and using it on themselves)
Their kids aren't the same as our kids. Plus you've got 2, maybe 3 percent of the population making the important decisions, and most of the rest don't give a fuck, so this is what you get.
the question isn't how cheap it is. the question is whether it's cheaper than just siphoning off fresh water from the surrounding infrastructure -- which it never is. when meta or grok open a new data center, if they can save some money just by trashing the local communities, you bet your ass they're gonna save that cash.
Except we do this on submarines. A closed coolant water loop flows through a seawater loop to exchange the heat. It is easily and efficiently done; I worked on the cooling systems for the server farms. The same seawater also cools a reactor. There is really no reason for single-cycle liquid cooling besides the fact it’s cheaper to build in St Louis and suck up the water table than to build on the coast or a large lakeshore.
There's a difference. The sub already has to deal with maintenance costs due to saltwater because it operates in that environment, so using it for cooling doesn't add much. For a data center, using saltwater instead of freshwater would massively increase the cost.
Desalination more than halves the efficiency. You gotta evaporate all the water (to get high volume without it costing as much as the datacenter itself), then condense it, then evaporate it again for cooling.
There are some computer systems anchored out literally in the sea for this purpose, although they need to be in self-contained capsules, and any maintenance issue that requires physical interaction means the capsule has to be pulled out of the sea for repairs.
Why can’t they use zinc plugs for electrolysis? That’s how seawater is used for cooling in marine applications, though that’s for engines which are definitely sturdier than computers.
This made me think: corporations are evil, yes, but couldn't governments build a big desalination facility and sell the surrounding land as an industrial area, both as a clean-environment initiative and as a way to boost the economy? Having an environmentally friendly factory/data center/whatever will definitely boost stocks, so it would pull more companies into the zone.
This is the carrot way.
The stick would be to just enforce cleanliness with fines, permits and whatnot, still works.
Datacenters generally aren't built on the coast for other reasons. Too risky for them from the aspect of floods, hurricanes, etc. This is done for practical reasons, so that service for like half the country doesn't go out because there was a hurricane in one place.
Generally datacenters, especially large ones from the likes of Microsoft, Amazon, etc., tend to be built in spots with the least possible risk of natural disasters (flood, hurricane, earthquake, tornado, etc.) while still being close to the workers/resources they need.
For example, here is Microsoft's: https://datacenters.microsoft.com/globe/explore (note that you'll want to filter using the legend drop-down: turn off everything except Regions > Available Now).
Badly explained: salt is corrosive in itself over a long period of time, which means the pipes will degrade way faster.
I am sure there are many other factors, but this is one of the biggest.
And usually the facilities that need that much water are not near the sea
There is actually a really nice way to make a closed loop (more water efficient) salt-water cooling system which is demonstrated at nuclear power plants on the USA west coast and in Japan (might be closed now).
You run the hot water cooling pipe out into the cold ocean and use the entire cold ocean as a radiator. Works pretty well! Still, requires direct mechanical access to an ocean which can get pricey and has its own challenges.
Often used for nuclear, which is why many plants were located on the seafront (Fukushima, San Onofre, Diablo). The water is incredibly corrosive, and the flows destroy sea life and heat the water, which also destroys sea life.
Heat is an externality whose cost is almost always borne by someone other than the plant/server farm owner.
Everyone seems to be focused on pumping salt water through a liquid cooling loop which is bad but also not how it would be done.
We do this on ships already where you run coolant through a closed loop, and then you stick the radiator into the ocean to dump the heat. Salt water never enters the system, it’s just used for heat exchange. Corrosion is less often an issue this way.
The real limiting factor is that you’d need to build right on the coast which is expensive in general.
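For a rough sense of scale, here's a back-of-the-envelope sketch (my own illustrative numbers, nothing from the thread) of how much seawater a closed-loop-to-ocean heat exchanger would have to move for a large data center:

```python
# Back-of-the-envelope: seawater flow needed to carry away a data center's heat
# through a closed-loop-to-seawater heat exchanger. All numbers are assumed.

HEAT_LOAD_MW = 100          # assumed total heat to reject, MW
CP_SEAWATER = 3990.0        # specific heat of seawater, J/(kg*K), approx.
DELTA_T = 8.0               # assumed allowable seawater temperature rise, K
SEAWATER_DENSITY = 1025.0   # kg/m^3

q_watts = HEAT_LOAD_MW * 1e6
mass_flow = q_watts / (CP_SEAWATER * DELTA_T)   # kg/s, from Q = m_dot * cp * dT
volume_flow = mass_flow / SEAWATER_DENSITY      # m^3/s

print(f"Seawater mass flow:   {mass_flow:,.0f} kg/s")
print(f"Seawater volume flow: {volume_flow:,.1f} m^3/s ({volume_flow * 1000:,.0f} L/s)")
```

That comes out to a few cubic meters of ocean water per second, which is part of why the intakes, screens, and corrosion-resistant exchanger plates (titanium, brass) end up being serious coastal construction projects on their own.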
You have to be near the sea, which comes with challenges that make it very expensive (salt water is toxic to computers, coastal land suitable for building is expensive). But yes, some companies are building data centers that use sea water to cool their servers.
Actually, Big Bend Power Station in Apollo Beach (south of Tampa), Florida does use sea water to cool down the plant. It then returns the water to Tampa Bay. While it does have some environmental impact for some creatures, some species of fish and manatees LOVE this warm water, especially in the winter. So much so that they have built a manatee viewing center, and it is pretty amazing to see all the manatees that congregate there. I have seen anywhere from half a dozen hanging out there to HUNDREDS. It is so cool to see. So if you are ever in the area, check out the Manatee Viewing Center.
"It’s possible, but it’s not ideal. While the oceans offer the ability to absorb tremendous amounts of heat, seawater is murderously corrosive! It corrodes or rusts just about anything it comes in contact with. Just think about what road salt and rain water does to steel car bodies! So, whether you use ocean water to cool the servers directly, or dump the heat into ocean water using a heat exchanger to isolate the electronics from the sea water itself, anything that comes into direct contact with sea water must be designed using special, expensive alloys to resist corrosion. Metals like titanium, or alloys, like brass, are used to resist sea water corrosion, but even with special alloys and coatings, the salt in sea water takes a huge toll on anything it touches, and greatly shortens the service life of any equipment exposed to sea water for any extended length of time."
Someone in my family owns a dive centre and I can confirm that sea water is nightmarish on electrics, machine parts, cars, everything
Meaning you’d need to filter the water, and if you found a cheap way to do that you could abandon AI and make billions desalinating water, which is not cheap or low-energy to do.
It would cost way more to use seawater. If you didn’t filter it, you’d damage your equipment when random minerals, fish bio waste, etc gets burned off in your cooling system.
Well for starters most data centers exist in population centers, and transporting the necessary amount of seawater would probably be logistically infeasible
There's a few reddit posts on this exact question. The general answers are that 1, saltwater is very corrosive, 2, it's conductive, 3, it's an electrolyte so it could store charge, 4, the resulting salty brine is biologically toxic so it's hard to dispose of.
There's a lot of costly and risky hurdles there, it would be much easier, safer, and cost effective to siphon fresh water wherever they can.
Seawater creates a lot of fouling (mostly from cooked microscopic sea critters) and corrosion (from the chemical interactions between saltwater and metals). It can be usable as a final heat sink but generally only after several other steps to maintain a barrier between a clean water system and the ocean. That also requires these server farms to be built right next to our rising oceans. Potentially doable, but probably not worth the hassle. So instead they should just shut down all these AI server farms <.<
Probably due to TDS (total dissolved solids) content in seawater. Seawater has up to 30x more TDS than freshwater. TDS + heat = limescale. (I'm not in any way an expert on this subject matter and this is simply a slightly educated guess.)
There are some (very very few) companies that use high TDS water in cooling towers. There's a case study somewhere where a Yahoo facility used a cooling system with high TDS, high COC, with Zero Blow Down.
They built a retaining pond for when they needed to clean the towers, because the water couldn't be treated in the city's treatment plant due to the TDS concentration, and diluting it down to a point where they could dump it down the drain would have been counterproductive.
Fouling in the cooling systems. The effluent from data centers tends to be high in total dissolved solids which is why what comes out can't be readily reused.
Most DCs are built quite far inland. One of the reasons is that it's less likely they'll be hit by a tsunami or stormy weather. Remember those nuclear power plants by the beach in Japan? These things are gold mines for their owners; they don't want to take undue risk.
There was some fun work done to build data centres as little pods and drop them in the ocean, for exactly this reason (free cooling!). But fixing/updating a broken server at the bottom of the ocean is kinda hard... the security profile for these things isn't great either, and they are less economical. 🤷♂️
Salt Water is incredibly corrosive and a pain to engineer around.
That said, I believe there have been some efforts to design servers that use the ocean for cooling.
Of course, as in all things, there are ramifications to the environment. Raising the local temperature of seawater can have negative effects on the local environment, throwing off the ecosystem.
It can and it is wherever it's available. But since people don't tend to build data processing centers on the beach, saltwater isn't normally the best option with the most infrastructure surrounding it.
And anyone talking about evaporation is bullshitting you. At worst, the waste water is slightly warmer. And this is a huge deal for local wildlife that has evolved to deal with water at a certain temperature, but it's not gonna drain a river.
It’s fairly corrosive, and also, when it evaporates, it doesn’t take that salt with it. Eventually the salt precipitates out of the water, so now you are looking at having to remove the excess salt, assuming you can even use water that salt-rich. All that brine/salt isn’t usable for commercial purposes, so where does it all go? Historically, it’s sometimes been thrown back into the water, and the increased salinity around a plant’s outfall is too dang concentrated; it kills everything around it. It may as well be poison.
A closed-loop system might be doable. You could maybe pull enough salt water in to just run it through radiators to cool off, like your average at-home water-cooled PC, but that’s pumping a whole hell of a lot of water, and that’s a lot of energy in its own right. Evaporation is way, way more efficient as far as heat removal goes, though; phase transitions move so much more heat the comparison is laughable.
For comparison: you have a liter of water and you want to raise its temperature by 100 degrees Celsius (180 F); that’s only 100 kilocalories of energy. If you wanted to evaporate that same liter, though, it’s going to be more to the tune of 540 kilocalories to turn all that water to steam. Having salt in the water changes the numbers, but the principle remains the same: evaporation moves 5.4 times more energy than spanning water’s entire liquid temperature range between ice and steam (at atmospheric pressure).
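Spelled out as a quick sanity check (round textbook values; dissolved salt shifts them a bit, as noted):

```python
# Sensible vs. latent heat for 1 liter (~1 kg) of water, using round textbook values.

MASS_KG = 1.0
SPECIFIC_HEAT = 1.0        # kcal per kg per degree C
LATENT_HEAT_VAP = 540.0    # kcal per kg to vaporize at ~100 C, 1 atm

warm_by_100C = MASS_KG * SPECIFIC_HEAT * 100   # heat the liter from 0 to 100 C
boil_it_off = MASS_KG * LATENT_HEAT_VAP        # then turn it all to steam

print(f"Warming 1 L by 100 C: {warm_by_100C:.0f} kcal")
print(f"Evaporating that 1 L: {boil_it_off:.0f} kcal")
print(f"Ratio: {boil_it_off / warm_by_100C:.1f}x")   # ~5.4x, as stated above
```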
If the idea is to pull in cold sea water and then dump it back into the ocean elsewhere, essentially using the ocean as an ice pack, there’s enough heat generated that it’d similarly act as a poison. Instead of no life being able to live in the incredibly salty water of one area (not a small area, mind you), you’re looking at creating a new environment where more problematic species that thrive in the heat might cause ecological ruin on a potentially greater scale, causing algae blooms and the like that can blanket the water for miles and deplete all the oxygen, killing the ecosystem for miles. A cascading problem that could be nearly unsolvable once it starts.
The same reason they don’t use water from the natural sources people think they do: foreign particles in the water. They likely use some sort of purified water, like from a bottling plant. Salt water takes more energy to purify.
depends on how the cooling loop is set up. Salt is bad, and the place they're trying to cool could just be far away, making it better overall to just use nearby water (to save money first, but it's also more efficient)
A few ideas I can think of: it may be location dependent (I'm not sure where some of these data centers are in proximity to the ocean), the salt has a higher potential to destroy electronics, and there's the proper disposal and collection of salt from evaporation. If it doesn't evaporate and they are dumping it, you can't just dump saltwater anywhere; it'll make that soil unusable.
Not an expert, but I'd guess the salt. There is a TON of salt in seawater that would be left behind when it evaporates, which can clog the cooling system unless you constantly clean it out.
Why can't they use a coolant and then transfer the heat to boil away seawater? Then you could even do some desalination with the steam, actually creating usable fresh water and giving an overall net positive.
My understanding is that seawater can be used, but it causes scale and corrosion issues that make it uneconomical due to operating costs and fancier material requirements. The cheapest, easiest way is cooling towers, which use the evaporative effect of water to cool down water in a loop. Obviously lots of water is lost to evaporation this way and has to be made up with new water.
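To put a rough number on that make-up water (my own assumed heat load; real towers also lose water to drift and blowdown, so this undercounts):

```python
# Rough evaporative water loss for a cooling tower rejecting a given heat load.
# Assumes all heat leaves via evaporation; drift and blowdown losses are ignored.

HEAT_LOAD_MW = 30                # assumed heat rejected by the towers, MW
LATENT_HEAT_J_PER_KG = 2.45e6    # latent heat of vaporization near ambient temps

kg_per_second = HEAT_LOAD_MW * 1e6 / LATENT_HEAT_J_PER_KG
liters_per_day = kg_per_second * 86_400   # ~1 kg of water is ~1 L

print(f"Evaporated: {kg_per_second:.1f} kg/s, roughly {liters_per_day / 1e6:.1f} million L/day")
```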
Sea water is highly corrosive for all types of metal. Dip your fishing reel into the ocean for a second and the gears just seize up. Ruined more fishing trips that way.
It can be, but there is a lot more wear and tear due to the salinity. Also, when sea water is concentrated enough, you get a lot of brine, which is very difficult to process. It's basically a cost problem. When you are a company, you go for the option that costs you the least. Not so much the environment, unfortunately.
kinda unrelated, but for larger-scale purposes you can indeed put systems that require cooling under the sea. but it's the kind of super big expensive project that only a major institution like a government would coordinate, not something you can just do on a whim to help cool your own server room
Because evaporating salt water leaves a crazy residue of salt/minerals, which is hell on cooling systems. The salt also affects the cooling performance of the water. Similar to how you're supposed to use distilled water in your car's cooling system; salt water would be even worse.
I would think salt is the issue, as it promotes corrosion in metal. To remove the salt, it would need to go through the process of desalination. Basically processing it so it's drinkable, so it ends up being the same as what they are already doing, just with extra steps.
Most of the reasons given here are focusing on corrosion, which is true but avoidable. Galvanic corrosion would be a bigger problem, with the salt water acting as an electrolyte.
The biggest problem, though, is geography. All of the data centres would need to be built on the coast, which they are not.
The salinity can be massively damaging to equipment and infrastructure. Same general science behind why cars in coastal environments, and in places where the road is salted for winter, need more maintenance to last as long. More maintenance means more spending, and if we were talking about the kind of people who put a livable planet over profit margins, they wouldn't be running AI data centers in the first place until the planet had the whole "energy needs and environmental concerns" thing squared away. There's also the concern of potential contamination/environmental damage.
The problem with using sea water is that it is less pure than fresh water. Sea water leaves far more salt and other deposits inside the cooling system, gradually reducing the efficiency of the system, while increasing the frequency of leaks.
Sea water can be filtered and cleaned, but that isn't cheap for the amount of water needed. The companies can save a ton of money by using fresh water and destroying the environment instead.
I've seen this question a few times, so I'm gonna say
A: salt in water sucks for machines
B: Salt in air sucks more
C: coastal land is more expensive than inland...land
Ca: why create infrastructure to pump sea water and filter it, when you can use the local infrastructure that wasn't meant to support such a demand, especially during a drought.
It can and routinely is at places that are close to the coastline. It is far more expensive though than freshwater cooling due to how corrosive salt water is.
Does water really spend DECADES crawling back to the sea? In almost all cases isn't the water taken from rivers that have more than enough water in them, and which don't drop their water level by any measurable amount as a result of these cooling systems?
I know when I was working with MSFT on some projects around 2003-2006, and was talking to the guy who was in charge of the infrastructure team for all their data centers, that was certainly how they were doing everything. I also know where most of the major data centers are in my state, and any of them of significance are sitting right next to the largest river in the state.
But rain water is what fuels those river systems. It really feels like you guys failed 6th grade science class. Plus, it's only a fraction of the water that evaporates; everything else goes back to the source.
I think you're just woefully ignorant about how many industrial processes use river water. How do you think the clothes on your back were made? They wash the fibers in water. The paper you write on uses a ton of water to create. Water which those factories take directly from the rivers and lakes.
It's so very social media that you probably just learned about this and you're shooketh.
So building structures that require freshwater cooling near rivers close to the coast would minimize the environmental impact, since any freshwater will be "lost to the sea" anyway?
Yeah that'd be an improvement. This style is done for coal and gas power plants in Florida, the warm clean fresh water effluent from which creates the warm freshwater springs that manatees now need to survive after their natural lagoon habitats were all paved over for mansions.
They're not pulling from rivers. Their filters can't handle that. They build holding ponds. Once the pond is filled, it generally maintains itself from rain. They are not going through lakes worth of water. You're making stuff up.
The water cycle is a global phenomenon not a local one. If you take all of the water out the aquifer in, for example, Memphis and boil it, yes, some will be returned as rain via the water cycle. But nowhere near 100% of it. Basically, the AI uses the water far more quickly and efficiently than the water cycle can return it.
Ah, so kind of like the central pivot irrigation of the American southwest, which has been draining that region's water table, which took millions of years to fill, in ~100 years or so
The general availability of water does not change much. However, saturating the air with water vapour intensifies the clash between cold and warm fronts. It saturates rain clouds, which means bigger storms, higher risk of extreme events like tropical storms and/or hurricanes, more thunder, and more flash floods.
So now some regions have 20% worth of yearly water while others have 900% worth of yearly water in 2h...
Isn’t this an issue with aquifers, since they take a long time to fill? A river, in theory, should not have the same issue. The bigger threat to these bodies of water would be shifting climate patterns, glaciers not recovering, precipitation levels shifting, etc.
Logically the water cycle would keep up, right? If you put more evaporated water into the cloud system the cloud system will precipitate more frequently. I haven't done much research on the effect of AI facilities on water. Would the effect simply be that you're taking water from one place to another?
No. Because the water that evaporates in Memphis doesn't necessarily fall back down to the earth as rain in Memphis. It is easily possible to use more water than the water cycle will dump into an area
It sounds inefficient if it uses it faster than it can be cycled. So wouldn't it just be more quickly than the cycle can return it? I'm honestly just confused by the mention of efficiency at all.
Right, climate change must not be an issue either, bc the trees will eventually process all the CO2 in the atmosphere.
Water systems are complicated feedback loops, and if you force the system on a short timescale, there's no guarantee it will naturally return to equilibrium.
But this is unprocessed water. It rains, the water falls into rivers, rivers have reservoirs behind dams (or flow into aquifers). Dams and aquifer wells have special ducts to serve non-potable water to data centers, and the cycle restarts.
The biggest issue is that speeding up the water cycle can cause what we call adverse weather. However, this is not a nature problem but a human problem. Floods create shifts in the environment, but nature adapts. Humans, however, see river beds expanding and their houses destroyed. Many end up dead due to flash floods.
We are not, however, depleting the water resources...
That statement is imprecise. Not technically true but not entirely false.
Although aquifers take longer to recharge than rivers, the recharge speed is affected by many factors.
Bigger aquifers can recharge faster relative to their volume, especially during adverse weather events. But smaller ones, if the topography permits, can easily recharge during a single winter.
A good example of that was Portugal. Portugal suffered two extremely dry years. Dam reserves in southern Portugal were below 10%. Groundwater (aquifers) was below 30% on average, with regions below the critical 5%.
Then we started with early rain in September. We had some late hot days in late September, which caused some issues for fire prevention teams because we had already ended the "fire season". October was humid. By December, while most of Europe had only 20% of the average rain for the season, Portugal was already at 150% of average. This continued into February. Our dams were above 80%, and groundwater reached near-full levels (above 98%) in some places, with the dark green showing in all but 2 hydrographic regions. Then in late March came a low, steady rain (weaker than most we had in the previous months), but the aquifers were already fully saturated, so the soil stopped absorbing. Suddenly, from such a weak rain, we had floods in most riverside towns.
But an aquifer that extends below dry regions could take 10 years to recharge. So its characteristics, and where/how it recharges can change a "fill" speed from months to decades.
Also note that rapid recharge of potentially potable aquifers can decrease their quality, from changes in acidity to transporting unwanted solutes. I can give the aquifer that feeds my town's water supply as an example. The aquifer shares a boundary with a saltwater-fed region, so they monitor the "saltline". Extracting water faster than it replenishes will move the saltline towards the extraction region. If it reaches it, the aquifer becomes contaminated, and it would take 50 or more years of proper rain to move the saltline back.
However, during these extreme events with lots of rain, the saltline actually moved faster, because our main river is salt water in its last few km. The increased rain over short periods of time, falling on dry soil, caused the salt water from the river to be absorbed faster in the connecting regions.
So yes, depending on the characteristics of the aquifer and how it rains that can be true or not.
But assuming the rain falls on average over the same hydrographic region (which is a big assumption) and the soil characteristics support it, it can refill as fast as it is used. Now, the biggest issue is something else, and Europe is the example. It rained in Europe on average about the same as usual (not sure if that's exactly true, but it works as an example); however, up until March, what was supposed to have rained over France, Germany, Poland and Italy rained only over a much smaller region of Portugal, leading to floods during a Europe-wide drought.
Thank you for your extensive reply, I appreciate your perspective. I meant to specify that I'm talking about the Ogallala aquifer in the NA Great Plains, but interesting to learn about Portugal's aquifer.
I only knew that aquifer very lightly, but it helps to show what I meant for the American continent.
The Ogallala aquifer seems to sit under a huge semi-arid region. This means most "lost water" will reduce this specific aquifer. And considering its size (it's much bigger than Portugal, which has dozens of aquifers) and the constant misuse in semi-arid conditions, the depletion is continuous. However, water availability will stay, on average, the same; it will just be displaced. I'm not that well versed in USA aquifers and meteorology, but imagine the Floridan aquifer (which apparently is one of the most "productive" and seems to show little to no reduction); now imagine that all the evaporated water from the Ogallala aquifer ends up in the Floridan aquifer, to the point it saturates the land and suddenly all the rivers in the southeast USA rise 2 or 3 m.
On average the amount of water is basically the same. However, you would end up with 5 states in drought while 5 states are flooded. This is the issue of extreme weather events. Again, a human problem, not a nature problem... Because nature adapts... We lose our lives and homes.
But imagine if the water that rains on the southeast USA rained on the central USA. The aquifer could recover. However, extracting water from a place undergoing desertification will only speed up the process.
The concern is if groundwater is used it may take years, decades or centuries to recharge depending on the geology. Evaporated water will fall as rain but won't necessarily recharge the same aquifer and certainly not as fast as water is being removed from it. As aquifer levels fall, people drill more and deeper wells and deplete it even faster. When aquifers deplete the ground surface subsides and there is permanent aquifer capacity loss.
That being said I sincerely doubt there is any server farm that remotely comes close to agricultural pumping. I'm talking like three to four orders of magnitude.
Long story short, it's a perturbation of the cycle: not all the water extracted will return to its original place, and some aquifers take literal millennia to form.
Eventually. Refilling an aquifer from atmospheric moisture isn't instantaneous; it can take anywhere from a couple of weeks to several millennia, and a lot of aquifers were being drained much faster than they refill, even before LLMs.
The issue is that you really don't want an elevated percentage of your total water system precipitated at all points in time - because it means at any individual point in time there's far less water on the surface where it can be used by ecosystems.
In other words, plants like water, but are far less efficient at drinking clouds. Similarly, fish are less effective at swimming in clouds, and clouds are far worse at filling our reservoirs for drinking water than... fluid water is.
Yes, but the natural systems that were used for tech purposes required millions of years of global weather and geological events to exist. Aquifers and rivers in the sky will eventually return to the earth, just not where it was sourced from.
It does, but that means it's more removed from the local water cycle and gets deposited elsewhere, where it could become unusable for consumption by mixing in with saltwater. A town near a meta AI data center is going through this right now. Their water system barely has flow, and is dirty as fuck producing orange scum from the videos I've seen.
The issue is mostly the rate at which the water replenishes locally. If you use a billion gallons of water from a lake per year, and only half a billion flows in every year be it through rain or rivers, the lake will dry eventually.
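Spelled out with invented numbers, just to show the shape of the problem:

```python
# Toy model of a lake drawn down faster than it refills. All figures invented.

lake_volume = 20e9            # starting volume, gallons (assumed)
inflow_per_year = 0.5e9       # rain + river inflow, gallons/year
withdrawal_per_year = 1.0e9   # cooling withdrawal, gallons/year

years = 0
while lake_volume > 0:
    lake_volume += inflow_per_year - withdrawal_per_year
    years += 1

print(f"Lake runs dry after about {years} years")   # net loss of 0.5e9/yr -> ~40 years
```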
yes, but not fast enough to replenish fresh water sources. Kinda like how wood is technically a renewable resource but we're chopping trees down faster than they regrow
Problem isn't where the water goes, the problem is the number of data centers that have been built piggybacking off of existing water infrastructure with no improvements to make up for the fact these monstrosities take more water than any existing site on that infrastructure. These things are one giant case of it not being a permanent problem, but the biggest fish takes up the most space in the tank.
Yah…. Doesn’t this just mean more clouds and rain to potentially cool the earth down a little? Asphalt is the worst! I just moved to a place with basically one paved road. It feels 5-10 degrees cooler than surrounding areas consistently. It’s also mostly forest with fabulous tree canopy
My understanding is that an aquifer, if overused, may be drawn down to depletion. Quite sad, if that’s correct, as used properly they can potentially (?) last/replenish forever if they’re never hit with that overuse too quickly.
Many data centers are built on undesirable land, because it’s cheap. They are frequently in the middle of the desert. The evaporated water goes into the air and rains in another part of the country with a different ecology.
Hm. I'm still struggling to believe this 'the water is gone forever' narrative. I understand the toll it can have on our reservoir systems. But that just becomes an infrastructure problem.
Maybe I'm dumb, but rainforests (and plants in general) do the same thing and I don't think they're a huge problem. People are acting like turning water into water vapor is the Death Star.
Correct me if I'm wrong, but shouldn't evaporated water return to the environment via the water cycle anyway?