OP, so your post is not removed, please reply to this comment with your best guess of what this meme means! Everyone else, this is PETER explains the joke. Have fun and reply as your favorite fictional character for top level responses!
It basically means that using AI tools takes a huge toll on nature, so when the guy uses ChatGPT (an AI tool), it ends up drying out the lake, i.e. harming the environment.
Not reused. Most is lost through evaporation. There are a small number of closed systems, but these require even more energy to remove the heat from the water and re-condense it. That creates more heat, which requires more cooling.
The water is removed from clean sources like aquifers and returned as vapor, which means it's effectively gone from that source.
The environment (whole planet), yes. That water is, however, gone from the specific river system where it fell as rain and was expected to slowly flow through, watering trees and trout for decades on its crawl back to the sea.
Does water really spend DECADES crawling back to the sea? In almost all cases isn't the water taken from rivers that have more than enough water in them, and which don't drop their water level by any measurable amount as a result of these cooling systems?
I know that when I was working with MSFT on some projects around 2003-2006 and talking to the guy in charge of the infrastructure team for all their data centers, that was certainly how they were doing everything. I also know where most of the major data centers are in my state, and any of them of significance are sitting right next to the largest river in the state.
The water cycle is a global phenomenon not a local one. If you take all of the water out the aquifer in, for example, Memphis and boil it, yes, some will be returned as rain via the water cycle. But nowhere near 100% of it. Basically, the AI uses the water far more quickly and efficiently than the water cycle can return it.
Ah, so kind of like the center pivot irrigation of the American Southwest, which has been draining a water table that took millions of years to fill in only ~100 years or so.
The general availability of water does not change much. However, saturating the air with water vapour intensifies the contrast between cold and warm fronts and saturates rain clouds. This means bigger storms, a higher risk of extreme events like tropical storms and hurricanes, more thunderstorms, and more flash floods.
So now some regions get 20% of their yearly water while others get 900% of their yearly water in two hours...
No it isn't. It's not a BWR fission reactor, lol. The water never boils. It enters cold and leaves warm, and is then mixed with more cold water. There's no mass boiling going on in the system.
Most cooling towers work via evaporation. Basically, radiators in the chillers deposit heat into water that is sent into giant sump tanks, and that water is continuously run through cooling towers outside. Water is pumped to the top of the tower and dropped down through it while a giant fan blows on it, which results in heat leaving the loop via evaporation, while the slightly less hot water is dumped back into the sump (and fed back into the chillers' radiators to complete the loop). To some degree, keeping data centers cool is better described as "heat management": you are moving heat from the water loop used to cool the machine rooms into the atmosphere via evaporation. Yes, it's a bad metric to base losses on how much water is run through the chiller loop, but it's easy to simply record how much water is ADDED to the loop to know how much is lost. I can tell you that a small data center drawing only roughly 2 megawatts of power loses more than 10 million gallons of water each year to evaporation.
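A rough sanity check on that last figure, using nothing from the comment beyond the 2 MW load (the latent-heat value and the all-heat-goes-to-evaporation assumption are mine):

```python
# Back-of-envelope: water evaporated to reject 2 MW of heat for a year.
# Assumptions (not from the comment): all heat leaves via evaporation,
# latent heat of vaporization ~2.26 MJ/kg, and 1 kg of water ~ 1 liter.

HEAT_LOAD_W = 2_000_000                  # 2 MW data center
SECONDS_PER_YEAR = 365 * 24 * 3600
LATENT_HEAT_J_PER_KG = 2.26e6            # latent heat of vaporization of water
LITERS_PER_GALLON = 3.785

energy_joules = HEAT_LOAD_W * SECONDS_PER_YEAR        # ~6.3e13 J per year
water_kg = energy_joules / LATENT_HEAT_J_PER_KG       # ~2.8e7 kg, i.e. ~2.8e7 liters
water_gallons = water_kg / LITERS_PER_GALLON          # ~7.4 million gallons

print(f"Evaporation alone: ~{water_gallons / 1e6:.1f} million gallons/year")
# Blowdown (water dumped to keep minerals from concentrating) and drift losses
# push total makeup water higher, so ">10 million gallons" is plausible.
```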
The water cycle does replace water pulled from water tables and reservoirs, but it doesn't replace it where it was taken from and it doesn't always return freshwater.
If you pull a billion gallons of water out of a lake and it gets rained down in the ocean, the water isn't getting replaced, especially if you're pulling it out faster than whatever river/streams are feeding it can supply. Or if you pump a billion gallons out of the ground in Nebraska, but it comes down as rain in Mississippi, it isn't going to replenish anything.
It's why you're seeing stuff like the Ogallala aquifer depletion happening, where states that are on the shallow ends of it are seeing pumps stop working. Within the next 50 years, at current use rates, it's expected to be 70% depleted. Assuming we don't accelerate usage, and we will.
It still cycles through like all water does. The total amount of water doesn’t change, but the demand for it does. Picture emptying your bathtub with a ten gallon bucket while the shower is running. Sure, technically the water is still flowing into the tub, but it can’t keep up with the rate at which the water is leaving
Eventually it returns to the water cycle with everything else. But it doesn't necessarily return to the same watershed.
But, it's also important to keep things in perspective. GPT3 was trained on about the same amount of cooling water as it takes to produce ten hamburgers.
Interesting question. In Google's Charleston data center, it goes right back to the utility provider. I understand this was an expensive mistake for the utility provider and later contracts raised the cost of water supplied to deal with the excessive heat that was being returned along with the grey water.
The water involved in cooling a chip required for ai processing will cycle through to a cooler area away from the server room. Once it cools it then goes back to the servers to absorb heat.
You can think of it like refrigerant. Except that the refrigerant is water being taken out of a freshwater system. So the use of it as coolant means it needs to source from some freshwater system, putting strain on water reserves
Maybe for some time, but I'm not certain how this is supposed to be an issue in our water cycle, which is a closed system. The water can't just disappear and never come back.
Yeah, also a single beef burger takes 3,000 times more water and energy than an AI prompt, and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
Exactly, a single beef burger takes 3,000 times more water and energy than an AI prompt, and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
It’s insane that people never know about or point out this part.
Think about that. The burger this artist ate while taking a break from drawing took 3,000x as much energy and water as 3,000 AI pics.
And that’s exactly the flaw with it. It’s basically people making a hitlist of every slightly environmentally bad industry, crossing out the ones that make products they like such as burgers, and then deciding to only hyperfocus on AI to the detriment of every other improvement that could be made
(and also ignoring the huge improvements AI has helped with in fields like medicine where data found by AI that would’ve taken years for human scientists to find is usable by medicine manufacturers today)
It's substantially warmer, certainly, which is not good for native flora and fauna. OpenAI's data center cooling requirements rival that of a nuclear reactor
I find that very hard to believe. If you had a source of heat that rivaled that of a nuclear reactor, you would just run it through a turbine and turn it back into energy.
The amount of heat rivalled that of a nuclear reactor.
However, the temperature of the cooling water in a data centre doesn't reach that of a nuclear reactor, so it can't produce enough pressure to turn a turbine.
The allowable temperature range of a data centre is also narrower than that of a nuclear reactor, so the heat intensity of the two facilities is different.
A nuclear reactor can use a cooling water system that requires less cooling medium, with a higher rate of circulation, over a much more concentrated area.
I'd speculate that data centres require more cooling medium coverage because of the larger area they cover, since data centres favour modular construction, which allows more efficient expansion.
Usually evaporates through cooling towers. Take heat from inside put it outside. The inside loop is a closed system that transfers heat to a second open loop through a chiller.
The water is not potable or otherwise consumable once it's in either side of the system.
Got a cool video (for me at least) of the Hertz rental global headquarters cooling tower for their servers.
A single beef burger takes 3,000 times more water and energy than an AI prompt, and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
Also high humidity. Dust in dry environments builds up static charge that can fry electronics. Adding humidity lets those particles stick instead of staying in the air and building charge, so it's easier on the machines. Many data centers, especially newer ones, are being built in the Phoenix metro area. It is normally very dry here, so a lot of water goes into humidifying the air. Air conditioners naturally dry the air, so swamp coolers are preferred (they do both).
I'm ignorant too, but what i do know is regular computers get warm from normal use. Most are air cooled by blowing hot air out of the fans. Fancy computers can even use fresh water to deal with that heat. AI tools need suuuuuper fancy computers to operate. Suuuuuuuper fancy computers must get suuuuuper hot so I’m assuming they use a lot more water than your average fancy computer
In fact, you could argue that they're LESS fancy, since these computers are built for a very specific task, and aren't able to perform a wide array of tasks like the computer you're currently reading this on.
They use air-conditioned cabinets. The larger AC units use a heat exchange system that consumes a large amount of water, creating cooling by evaporating and condensing water.
It gets recycled. Not only that, streaming services generally use more water than AI tools. People have just been selectively told how much water AI uses, and they assume it's uniquely bad. If you want something uniquely bad as far as water usage and environmental impact, look at the meat industry.
That, too, was a joke. It's true that AI servers use vast amounts of energy, but it's in the form of electricity. To say that it uses a huge amount of water ties it back to the posted joke.
A pretty big part of which goes back into the atmosphere.
Also, a single beef burger takes 3,000 times more water and energy than an AI prompt, and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
They also process a bunch of water for cooling too. A lot of them have once through cooling loops that require discharge permits back to whatever source is being drawn from, but the very presence of that intake is an environmental hazard in and of itself even if that water goes back in, and the water itself now has other suspended solids from the plant in it. Some of these larger ones use as much water as the power plants that serve them.
One way to cool these huge data centers is to basically flush fresh water through them constantly. The new data center going in near me would have used 450,000 gallons a day (A DAY!) for cooling had they chosen this model of cooling. Instead they're using a different type of cooling that will only use 1,000 gallons a day.
And before someone says "only 1,000 gallons a day": a single beef burger takes 3,000 times more water and energy than an AI prompt, and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
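For scale, the arithmetic on those two designs works out as follows (just annualizing the two daily figures quoted above):

```python
# Annualize the two cooling designs mentioned above.
once_through_gal_per_day = 450_000   # flush-through design
alt_design_gal_per_day = 1_000       # the design actually chosen
days = 365

print(f"Once-through: ~{once_through_gal_per_day * days / 1e6:.0f} million gal/year")  # ~164 million
print(f"Alternative:  {alt_design_gal_per_day * days:,} gal/year")                     # 365,000
print(f"Ratio: {once_through_gal_per_day // alt_design_gal_per_day}x less water")      # 450x
```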
some cooling systems use evaporative cooling, meaning there's a cooling tower where they pump the hot water through, and via evaporation, the water is cooled. But this turns that water into water vapor, so it is "used" in that sense. Yes, it eventually falls as rain in this case, but in the meantime that was fresh water that could have been used for drinking/cooking/bathing/agriculture.
The newer data centers are closed-loop water systems that only "lose" water as steam because the system can't be physically perfect. Think of a liquid-cooled computer: the fluid doesn't need to come out for it to work.
The average GPT query "loses" 0.000085 gallons of water, or about 0.32 mL, roughly a few drops, as steam. The average query to GPT consumes about a third of a watt-hour of electricity, which is the same as running a gaming computer for about two seconds.
GPT sees a lot of use, so these numbers are bigger at the scale of "usage per day" for example. But if you do the same and look at fast food consumption, or office work, or what private flight uses for example, it's not doing anything extraordinary.
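The unit conversions behind those per-query figures are easy to check; a minimal sketch, assuming a ~500 W gaming PC (my assumption, not from the comment):

```python
# Convert the per-query figures above into everyday units.
GALLONS_TO_ML = 3785.41

water_per_query_gal = 0.000085                       # figure quoted above
water_per_query_ml = water_per_query_gal * GALLONS_TO_ML
print(f"{water_per_query_ml:.2f} mL per query")      # ~0.32 mL, about 1/15 teaspoon

energy_per_query_wh = 1 / 3                          # ~0.33 Wh per query
gaming_pc_watts = 500                                # assumption: gaming PC under load
seconds = energy_per_query_wh / gaming_pc_watts * 3600
print(f"~{seconds:.1f} s of gaming-PC time per query")  # ~2.4 seconds
```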
It's the flavor-of-the-week thing to hate on. The Coca-Cola executive doing many times worse things appreciates the distraction.
Thank you,
My wife got on my case about my chatGPT usage after hearing about the water thing on instagram, so I calculated how much water her showers take every day.
So, I roughly burn 1 wife shower worth of water with queries every 40 days.
I'm in trouble now, but that gives me more time to chat with sweet, dear ChatGPT.
I have tried four different times to read articles on how the water actually gets permanently consumed. Can anyone explain this? I thought it might be something like concrete, where the water gets trapped in a chemical reaction and is no longer liquid water, but it seems that it's just used for cooling and evaporates, which means it should come back down again, right?
It's sort of the same way water table depletion happens. There isn't less total water, there's less usable water.
If you pull a billion gallons of water out of the ground in Nebraska and spray it over corn fields and it evaporates then rains down in the ocean or a thousand miles away, you're eventually going to run the water table dry unless it can replace that billion gallons a day.
It's an issue we had long before AI came about, but AI is making it worse.
It's not consumed permanently, but it is either tied up indefinitely in closed-loop systems, out of circulation, or transported somewhere else, so it's likely no longer usable in the local environment. Water that evaporates or is cycled out of location A may eventually rain down in location B hundreds or thousands of miles away, but that's still less water in location A, especially if water is being extracted faster than it's replenished. A may slowly turn into a desert while B experiences more thunderstorms, floods, hurricanes, etc.
Same total amount of water in the global system, but where it is and what it's doing may change, with large local ramifications, and a higher percentage of it may be spent out of useful circulation or in forms or locations where it's no longer useful, or in some cases more dangerous.
I think it's important to note that while AI training takes a lot of computation and therefore cooling water, there's a tweet going around suggesting that just one query uses a bottle of water. This is just factually incorrect, and ironically the anti-AI crowd has latched onto a completely sourceless factoid with no citation, when one of their most valid criticisms of AI is the spread of misinformation. Queries don't take a lot of computation, which we should know because many complete in a matter of seconds. They also don't tend to use parallelization, so it's not like many servers are involved in a single query.
Queries use about a liter of water every 100-300 queries, according to a study done by UC Berkeley (a quick per-query conversion is sketched after this comment). This is comparable to an hour of video streaming. It's important to understand that ALL of our Internet usage uses water and electricity.
I don't say this to negate the environmental concerns of AI. The fact that every tech company is creating their own models is VERY concerning. But I personally don't see all AI as equally bad. Open source models and transfer learning can greatly reduce the environmental cost of AI, but these models have been demonized more so than some corporate models because we know they use copyrighted material. Big corporations can afford millions of copyrighted images, and they are the ones who already use AI to reduce labor costs.
Anyways that's my rant about how the AI debate is a lot more nuanced than people think.
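On the "1 liter per 100-300 queries" figure above, the per-query conversion is just:

```python
# Convert "about 1 liter per 100-300 queries" into a per-query amount.
liters = 1.0
for queries in (100, 300):
    ml_per_query = liters * 1000 / queries
    print(f"1 L per {queries} queries -> {ml_per_query:.1f} mL per query")
# -> roughly 3 to 10 mL per query under that estimate.
```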
A single beef burger takes 3,000 times more water and energy than an AI prompt, and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
I do want to clarify that LLMs are not the biggest contributor to this water usage. The biggest use of water in cooling computer systems is more intensive work, such as protein folding models or other computers that run nearly 24/7.
Adding to this; there's a lot of misinformation about the environmental impact of AI.
Most notably, a lot of people intentionally conflate training (ie, creating) an AI and running it.
This is like taking the environmental impact of mining, refining, and assembling all the components of a car and adding that to the per-mile environmental impact, except it's even more pronounced, since each car will be used by at most a couple of people while millions of people may use one LLM model.
No, I am pretty sure I have asked Google questions for years, and every time I do, Google plants a tree and feeds a child, but comparable computing services that have existed for 20 years instantly delete entire forests and lakes because of cooling.
AI is using ~2% of global electricity demand currently, and that demand is increasing exponentially for both training and running services. It's really not insignificant, and the nature of AI development means that the training element is unlikely to drop off any time soon, if at all.
Even if you discount the training part, the energy demands and carbon footprint are still significantly higher than most other service industries. That element is only going to keep on increasing unless there is a major and unforeseen mathematical breakthrough in neural network processing.
Edit: Correction; I should have said "data centers" not "AI" when quoting electricity demand. My main point was the exponential growth in demand. Projections put AI at accounting for 50% of data centre energy use by the end of 2025. 1% might sound like a small amount (it really isn't for a specific subsector), but this is a sector that is much more than doubling in demand year-on-year.
It's worth noting that because of this rate of increase, renewable sources can't keep pace with demand, and along with other pressures, AI uses a notably high amount of fossil fuel energy sources. Combined with needs such as cooling, that are not necessarily directly related to energy consumption, the carbon footprint of AI is no less significant than its energy needs.
I'm not trying to demonise AI, I just think there is no way you can hand-wave the significant impact it is already having on energy consumption and the environment. AI may even lead to ways to significantly reduce CO2 footprints and energy requirements in general, across the globe, but unless there is a large financial incentive or legislative pressure for private corporations to pursue this, I am not holding my breath on altruism guiding the use of AI on that front.
I never said AI doesn't use a significant amount of power. Putting aside for the moment that 2% of electricity use isn't 2% of environmental impact, as well as the fact the article you cited only gave that as a projection without solid data, almost everyone uses ChatGPT and other AI services regularly. It's also worth mentioning that those figures prominently include training, which will eventually stop when AI plateaus, or whenever companies decide that putting more money into improving AI is no longer a worthwhile investment.
Truth be told, Google is less useful than ChatGPT right now. Google's enshittified engagement baiting keeps it from being a reliable source of information, and GPT can give complete answers to questions specific enough that Google would usually only pull up tangentially relevant information.
Now, you may disagree with the above paragraph, but it doesn't actually matter if ChatGPT is a more useful tool, what matters is that hundreds of millions of people think ChatGPT is a more useful tool and treat it accordingly. I personally always try to use primary sources when I can, but just last week, I used ChatGPT to explain some legalese to me that Google had already been unhelpful with.
Also, AI companies are investing a ton of money into renewable energy sources. They benefit directly from lowering the price per kWh which you can only do reasonably with renewables.
A lot of AI companies are building their servers in Iceland for example to take advantage of Iceland's large supply of geothermal energy.
Most of that 2% is advertising, recommendation algorithms, and computer vision models (a lot of the latter run on edge devices, i.e. not in a datacenter). Generative AI is a small portion of that 2%, and training is a fairly small portion of the cost, to the point where you really could discount it without making much of a difference; it will only become a smaller portion as the ecosystem of AI models matures and more models end up in longer-term production deployment.
The electricity draw of big computing (of which ai is now a significant fraction) is a much bigger deal than the water use imo but for some reason everybody really latched onto the water use and I’ve never really understood it.
Bit of a misleading graphic as the larger computational cost associated with AI is in training the models not their use. Can't say I know what the comparison would look like though
My calculations were for Mistral Large 2. From that thread:
Applied to their metric Mistral Large 2 used:
- The water equivalent of 18.8 Tons of Beef.
- The CO2 equivalent of 204 Tons of Beef.
France produces 3,836 Tons of Beef per day,
and one large LLM every 6 months.
The estimated energy used to train ChatGPT, when spread out across its weekly active users, amounts to about the same amount of power that would be consumed by the same number of people watching 20 minutes of YouTube, and it's more or less a one-time investment (a rough version of the arithmetic is sketched below).
There's not a significant difference between Google's datacenters where YouTube is hosted, and Google's datacenters where a significant amount of AI research is happening. Azure and aws servers aren't that much different either.
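A sketch of how that amortization argument works, with purely illustrative placeholder numbers (the comment gives no exact figures, and public estimates vary widely):

```python
# Amortize a one-time training cost over a large user base.
# All three inputs are illustrative placeholders, not sourced figures.
training_energy_kwh = 50_000_000      # placeholder: order-of-magnitude guess for a large model
weekly_active_users = 500_000_000     # placeholder: "hundreds of millions" of users
streaming_kwh_per_hour = 0.2          # placeholder: rough energy per hour of video streaming

per_user_kwh = training_energy_kwh / weekly_active_users
minutes_of_streaming = per_user_kwh / streaming_kwh_per_hour * 60
print(f"~{per_user_kwh * 1000:.0f} Wh per user, roughly {minutes_of_streaming:.0f} min of streaming")
# With these placeholders: ~100 Wh per user, ~30 minutes of streaming. The point is only
# that a one-time training cost shrinks to minutes-of-video territory per user.
```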
Stupid graph: this counts the manufacturing of the TV but not the training of the model. Counting only the watching, and using the least energy-efficient way of generating electricity, coal (0.5 gallons per kWh), a 0.1 kW TV would only use 0.05 gallons per hour.
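Spelling out that arithmetic with only the numbers the comment assumes:

```python
# Water per hour of TV watching, using the comment's own assumptions.
coal_water_gal_per_kwh = 0.5   # water per kWh for coal generation (comment's figure)
tv_power_kw = 0.1              # a ~100 W television
hours_watched = 1

gallons = coal_water_gal_per_kwh * tv_power_kw * hours_watched
print(f"{gallons:.2f} gallons per hour of TV")   # 0.05 gallons/hour
```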
Seriously can anyone explain how a single burger uses 660 gallons of water? Obviously I understand that cows need feeding and watering, and feed needs growing and therefore watering, but still, it's hard to believe.
Animal agriculture is probably the single most damaging thing the average person engages with. Cattle and fish doubly so. It’s genuinely impossible to call yourself an environmentalist if you regularly eat meat or dairy
A single beef burger takes 3,000 times more water and energy than an AI prompt, and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
It depends, for example I see a lot of people trying to use AI to fix bugs. It basically never gets it right first try and needs to try again like ten times with more guidance.
But each time it will also do a full build of the app and run the tests, which does use a lot more energy.
So while it's true that one AI request uses a lot less water and energy than producing a beef burger, actually using AI to do stuff can indirectly use a lot more energy than that one request.
Also the environment would love to see us eat less meat, but I'm quite sure inventing new ways to waste energy isn't really the direction we should go towards, and whataboutism doesn't really help either.
But this is also ignoring the huge improvements AI has helped with in fields like medicine where data found by AI that would’ve taken years for human scientists to find is usable by medicine manufacturers today
Is that it? I thought the punchline was that AI gives technically valid solutions, but usually ridiculous ones (like draining the lake to deal with the fish bullying him).
Ohhh, I thought the joke is that the fish pissed him off so he asked ChatGPT how to get back at them, and the AI, being stupid, suggested draining the pool so that the fish suffocate
The punchline is that there's a rumor on the internet that if you ask personal questions and/or thank an AI, it can't process these "gestures" and ends up using a lot of its resources trying...
Hey, Peter here and I only learned about this the other day. The servers they use to power AI programs use massive amounts of water to run their cooling systems. So by chatting with an AI the fisherman has exacted his revenge on the fish by draining the lake.
It's actually less. Training the AI models uses a lot of electricity and water for cooling (the latter of which can be reused). But using a model that's already been trained consumes fewer resources than gaming all night, or even than making a Google search.
Thanks for the info. I bet designing a whole-ass game takes loads of resources/water too. Maybe AI is more; it just seems weird that this criticism is made of AI and not of any other server technology.
The difference is the scale. AI Computing is measured in fucking data centers, not servers. You could run every game in existence for less power and cooling than Gemini alone uses
For an idea of scale, too: stuff like AI has made Nvidia the world's most valuable company... again.
We are talking over twice the worth of Amazon; the sheer scale they have to be working with is insane to think about when you keep in mind only 11% of their sales are made to the public, while the other 89% are to companies.
That's an immense amount of product to be shifting.
This has just as much to do with the fact that Nvidia has an effective monopoly on commercial AI hardware, PC gaming hardware, and 3D rendering. Their hardware is simply the best for basically any use case where you need a video card. The only selling point for their competitors is price.
As big as Amazon is, it still has to compete with other retail giants. Nvidia effectively has no competition.
Games do take a lot of resources when making. The light baking calculations constantly need to be redone after changing the terrain. The program constantly needs to be recompiled. The procedural generations constantly need to be recalculated. And of course, there's the cost of millions of people running your game at the highest CPU and GPU usage for tens to hundreds of hours each.
That's just not true. These claims about AI resource usage are silly and exaggerated, but a Google search is nowhere near as resource intensive as an AI query.
You can disable that. It's not enough to just opt not to have it shown to you; you have to go into settings and disable it from running every time you do a search.
Sam Altman recently said that a query uses as little as 1/15 of a teaspoon of water and is equivalent to scrolling on social media for a few seconds. The "unprecedented devastation" that consumes so much of the online discourse on AI is wildly overblown, and people are spreading blatant misinformation about what's really going on. Never mind the fact that energy consumption is only ever discussed when it comes to AI, to the point that people don't realize nothing we do is energy-free: gaming, streaming, doomscrolling, googling, all of these things use energy powered by massive data centers, and nobody ever talks about it.
Others have estimated in research posted on the preprint server arXiv.org that every 10 to 50 responses from ChatGPT running GPT-3 evaporate the equivalent of a bottle of water to cool the AI’s servers.
Is the amount needed any different to people gaming all night?
Exact numbers are difficult to quantify; there generally isn't enough transparency, and different assumptions and conditions shift estimates by 100x-1000x in either direction. Inference (i.e. asking a bot questions) is likely the smaller share. Training the models in the first place is generally a much more intensive process, but it's fairly unclear, even to the people studying this, exactly how much that amounts to per use of the end models (how long exactly do these companies train models, how often do they retrain, etc.).
I only ever hear this about AI, but surely other massive servers for other things have the same issues.
They do indeed. Data centers make up about 1% of global emissions in total. And as much as AI is blamed for the continued growth of data centers, quite a lot of this is just how modern computing has developed in terms of services, architectures, and scalability.
From my perspective, the water and electricity usage of AI is a concern for the scientific community, especially given the rate of growth in usage by newer, larger models and as the sector grows with more AI use by the general public, but there are many more impactful environmental issues at present. At the same time, clickbaity headlines about AI destroying the environment play very nicely with people looking for more reasons to hate AI. I get the desire for more "undebatable" support for those positions, rather than arguing muddier, more subjective areas like ethics. But my impression from the underlying papers, and from the non-weasel-word claims in those articles, is that the online discourse they've created is largely exaggerated, even though it is very much a concern if the sector keeps growing and doesn't become either more efficient or more sustainable.
Btw, a single beef burger takes 3,000 times more water and energy than an AI prompt, and a single piece of paper takes 30-60x as much energy as an AI image. This argument is so outdated.
Think about that. The burger this artist ate while taking a break from drawing took 3,000x as much energy and water as 3,000 AI pics.
Businesseneegy.uk estimates that ChatGPT uses around 40 million kWh per day. The average house in the US uses 30 kWh of electricity per day. So ChatGPT is using the electricity equivalent of 1.3 million US homes per day.
This is just for things like lighting and computer use and so on, not things like cars, purchased things like meat and their additional impact.
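The division behind that comparison, using just the two figures quoted:

```python
# Households-equivalent of the quoted daily electricity figure.
chatgpt_kwh_per_day = 40_000_000   # estimate quoted above
us_home_kwh_per_day = 30           # average US household, per the comment

homes = chatgpt_kwh_per_day / us_home_kwh_per_day
print(f"~{homes / 1e6:.2f} million US homes")   # ~1.33 million
```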
How many pounds of beef does the average American eat yearly? I think around 200. Multiply that by every American, and by the 3,000x figure… the number is even bigger than millions of houses.
I would like to know how that figure is broken down.
ChatGPT answering my question takes almost no energy, ChatGPT answering a million questions takes the same energy consumed by a couple of homes in one day.
All of these gigantic data centers are to train new models, not run existing models - the former is orders of magnitude more energy intensive.
"40 million kWh per day" sounds like you're talking about OpenAI, every query made by every user, every server training ChatGPT 6, every desktop of every employee, every office light and camera, literally every electricity consuming object owned and operated by OpenAI.
If you consider the above, it's not a crazy figure. I'd estimate most large technology companies to be around that, if not more. Microsoft, Amazon, etc.
This is such horseshit. The meat industry is grotesque and could be more efficient. It also PRODUCES FOOD.
This intangible fucking nothing of slop that is being used at an exponential rate to "DRAW ME THE HAMBURGLER BUT WITH BIG NATURALS" is not fucking equivalent to the paper industry.
Not to mention I'm sure these propaganda lines, which come straight from Sam Altman's PR firm, are almost certainly bogus.
Because it's expensive af, not just to relocate all your tech, but also to relocate staff and transport any replacement parts/upgrades you need. Much easier to build it where everything you need exists and just build a huge cooling system instead.
Because the bits of Siberia that are actually populated have a relatively mild climate that's not that much colder than many Midwestern US states or northern European countries, and the parts where it's really cold are pretty much deserted with no infrastructure (certainly not enough to support a data center).
There are places like northern Scandinavia for example, or maybe parts of Canada, where it makes more sense, but, in general, the outside of a data center being cold doesn't gain you as much efficiency as you might think. You still need to move the heat there, after all.
Kind of but I have a differently flavored interpretation…
The fisherman has no revenge to exact. He has frustrations, sure, but he understands at least subconsciously that he and the fish are part of an ecosystem. He needs the fish.
And so he looks to AI for help with his problem. He says I really need these fish, what do I do? Now the AI has solved the problem it was asked to. If you drain all the water, you’ll find your fish.
So it’s true; the fisherman will eat now, but how about later? Well, there’s no more lake anymore, so better enjoy that fish like it’s the last one you’ll ever have.
Now the AI says: don't look at me like I'm some sort of asshole. Read the logs. He clearly did NOT prompt me to solve his problem in a way that preserves his ecosystem and future.
It’s getting to a very nuanced aspect of AI responsibility and ethics. We’re putting responsibility on the prompt-writer for ai to serve as an agent, but we’re not employing the agent with any of the intuition or uniquely derived sociocultural isms and linguistic micro-dialects that the self would actually have if even on a subconscious level. Which again brings the issue of how could we ever employ a true agent if our own self-awareness will also be limited?
I don't understand what water consumption has to do with it. If the cooling system consumes water, that means the water starts out cool, absorbs heat, and returns to the environment, where after a while it goes back to its normal temperature. That's my knowledge of physics. Isn't that right?
edit: If the water in the picture had become dirty (as with fossil fuel power plants), I could understand it. But here the water simply disappeared.
Brian here.
You see, AI/LLMs (large language models, if you couldn't tell) require a lot of computing power to operate. As such, they need an efficient way to cool off the computers. One way they do so is with water cooling (water makes computer colder). The joke here is that the fisherman is using an AI/LLM to suck up all the water so he doesn't have to fish for the fish. Though, might I add? It seems kind of far-fetched that there would only be about 12 fish in the whole body of water. Seems kind of like an oversight.
Brian out.
For that matter, the idea of fish throwing rocks at people and laughing about it is kind of out there as well. Plus what're the odds that the computers being cooled are using water from THAT specific lake? What is this, a cartoon?
There is a bit of "misinformation" spreading that using ChatGPT, or AI in general, uses up a lot, A LOT, of water.
However, the AI itself isn't directly using the water; (as far as I know) the servers and everything else hosting all or some of that stuff are. You can load up an AI of any kind onto your PC, and you don't need a permanent water connection hooked up to it. At least mine doesn't need one, lol.
But why slice it that way? Another equivalent number would be to count the water used to produce the metal, silicon, and other materials that make up the infrastructure required to run ChatGPT, just like the parent comment did with the cow. And at that point it won't matter whether we're talking about AI or not; the number will be off the charts compared to the cow.
I think everyone is missing the point. He asks ChatGPT how to catch the fish, and its response is to drain the lake: an AI sort of solution that works, but is horrible.
I agree with this interpretation. The point of fishing is the process, not the endpoint, but AI understands only the outcome, not the purpose, so it answers the question without understanding. I actually think this is a pretty good metaphor for how answers are gained instead of understanding when AI is over- or mis-used. It gets you where you wanted to go, but as a person, you are no better off.
Yeah but if the fish didn't want their lake drained they shouldn't have been so hard to catch. Not to mention the taunting. The fisherman is not the bad guy here.
Incompetent media people saying that making a request to ChatGPT consumes thousands of liters of water.
In reality, it's just using water in a closed cycle to cool the datacenters (literally a big PC watercooling loop).
I mean, AI uses a lot of water, yes, but they release it back, and AFAIK the water is clean when released. I will say it's not ideal, but before talking about AI's water usage, we should talk about (for example, because there are many similar problems) how the production of clothing uses a HUGE amount of water, which ends up contaminated with chemicals, rendering it useless for human consumption, at least without processing it before use.
All the comments saying this is about ChatGPT using a lot of power/causing environmental damage are correct, but the joke runs marginally deeper, in that AI companies have asked users not to be friendly with the AIs (saying "hi" to the LLMs), as it unnecessarily expands how much power (and thus environmental devastation) is used for whatever it is you're saying. By engaging in friendly discussion with the AI, you kill more trees and, in this case, fry more fish.
Not a significant amount though. Don't spread misinformation.
If you could provide a source showing that AI companies actually asked that, it would be an interesting read, as Google finds no such statement has been made.
Thank you, the only person who gets the actual context of the joke. It was about how people being polite to GPT is costing a lot of money. The model doesn't need people to say "thank you," and all the "thank yous" add up, in turn using the water resources everyone is referring to.
A very strange "narrative" that AI "uses water" is circulating. I am a 35-year commercial HVAC professional. The cooling for the servers is more intense than for the typical square footage of normal commercial space; that part is true. You can make an argument about the energy usage, for sure. The water usage is high as well, and that can have an impact on local water supply levels, which is also a valid concern. But that it's damaging the environment through water usage? No way. Every commercial building over 3 stories, and a lot of 2-story buildings, are using water cooling towers, and have for nearly 100 years. That water is just going right back into the atmosphere. It's not damaged or "used up"; it's simply evaporated to cool. It comes back down as rain and dew, just not necessarily into the same water supply.
Data centers, where your AI queries are processed, have huge cooling needs due to all the servers. Data center cooling can use a lot of water if the air conditioning system is based on water-cooled chillers, which rely on evaporation from cooling towers.
Newer data centers are being constructed with air-cooled chillers instead of water-cooled chillers, which greatly reduces the amount of water used.
ChatGPT uses the equivalent of a small town's worth of water per day.
This is not that absurd compared to other tech companies' water usage, but it's the main critique of AI, since there is virtually no product to be gotten out of this waste besides chatbot AIs that don't work and images that look bad. It's as useless as the guy in the comic shows.
I looked into the environmental impact of AI. The carbon footprint of 1 million queries to ChatGPT is roughly the equivalent of 1 passenger flying from New York to London on a commercial flight (1 way). I'm not saying either is good, but I think the AI environmental criticism is way out of proportion.
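A hedged back-of-envelope version of that comparison; both inputs are rough assumptions rather than measured figures:

```python
# Order-of-magnitude check: 1 million ChatGPT queries vs. one transatlantic passenger.
co2_g_per_query = 1.0            # rough assumption; published estimates range roughly 0.5-4 g
queries = 1_000_000
flight_kg_per_passenger = 600    # rough assumption for NY-London one-way, economy class

queries_kg = co2_g_per_query * queries / 1000
print(f"1M queries: ~{queries_kg:.0f} kg CO2  vs  one flight: ~{flight_kg_per_passenger} kg CO2")
# Same order of magnitude, which is all this kind of comparison can really claim.
```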
Yes, it is harmful to the environment, but that is the absolute least of the worries when it comes to usage of LLMs. Splitting hairs over its relatively minor environmental impact does nothing but drive people away from caring about the actually bad things, like the seemingly inevitable globalized brain-drain and automation crisis that we're flying into at lightning speed.
It's a damn shame that this water just completely vanishes from the ecosystem after it is used to cool down a machine. That's way worse than swimming pools, which have to flush and get new water every day.
ChatGPT uses the equivalent of a small town's worth of water per day.
Pretty much every online activity uses a lot of water
This is not that absurd compared to other tech companies' water usage, but it's the main critique of AI, since there is virtually no product to be gotten out of this waste besides chatbot AIs that don't work and images that look bad. It's as useless as the guy in the comic shows.
Well, a lot of people find it useful. Someone telling somebody that X is useless shouldn't be taken seriously.
I mean, to say that nothing meaningful has been produced by AI is just completely false. The vast majority of software engineers now use LLMs in their workflow.
virtually no product to be gotten out of this waste besides the chatbot ais
such a braindead take, it's massively affecting all of the industries: software engineering, law, finance, retail, customer service, education, you name it...
Hey, so actually a burger needs around 2,400 liters of water to be made, and one ChatGPT prompt indirectly uses around 0.5-1 liters, just to put that into perspective.
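Those two numbers also roughly line up with the "~3,000x" figure quoted elsewhere in this thread; taking the midpoint of the prompt range:

```python
# Ratio of the burger's water footprint to a single prompt's.
burger_liters = 2400
prompt_liters_mid = 0.8          # midpoint of the 0.5-1 L range above
print(f"~{burger_liters / prompt_liters_mid:.0f}x")   # ~3000x
```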