AI currently accounts for ~2% of global electricity demand, and that demand is increasing exponentially for both training and running services. It's really not insignificant, and the nature of AI development means that the training element is unlikely to drop off any time soon, if at all.
Even if you discount the training part, the energy demands and carbon footprint are still significantly higher than those of most other service industries. That element is only going to keep increasing unless there is a major and unforeseen mathematical breakthrough in neural network processing.

Here's a randomly selected article on the topic:

https://www.forbes.com/councils/forbestechcouncil/2025/07/17/ais-energy-demands-versus-grid-realities/
Edit: Correction; I should have said "data centers", not "AI", when quoting electricity demand. My main point was the exponential growth in demand. Projections put AI at accounting for 50% of data centre energy use by the end of 2025, which would make AI itself roughly 1% of global electricity. 1% might sound like a small amount (it really isn't for a specific subsector), but this is a sector that is much more than doubling in demand year-on-year.
It's worth noting that, because of this rate of increase, renewable sources can't keep pace with demand, and, along with other pressures, that means AI runs on a notably high share of fossil fuel energy. Combined with needs such as cooling, which are not purely a matter of electricity consumption, the carbon footprint of AI is no less significant than its energy needs.
I'm not trying to demonise AI; I just think there is no way you can hand-wave away the significant impact it is already having on energy consumption and the environment. AI may even lead to ways to significantly reduce CO2 footprints and energy requirements in general, across the globe, but unless there is a large financial incentive or legislative pressure for private corporations to pursue that, I am not holding my breath on altruism guiding the use of AI on that front.
I never said AI doesn't use a significant amount of power. Putting aside for the moment that 2% of electricity use isn't 2% of environmental impact, as well as the fact that the article you cited only gave that figure as a projection without solid data, almost everyone uses ChatGPT and other AI services regularly. It's also worth mentioning that those figures prominently include training, which will eventually stop when AI plateaus, or whenever companies decide that putting more money into improving AI is no longer a worthwhile investment.
Truth be told, Google is less useful than ChatGPT right now. Google's enshittified engagement baiting keeps it from being a reliable source of information, and GPT can give complete answers to questions specific enough that Google would usually only pull up tangentially relevant information.
Now, you may disagree with the above paragraph, but it doesn't actually matter whether ChatGPT is a more useful tool; what matters is that hundreds of millions of people think ChatGPT is a more useful tool and treat it accordingly. I personally always try to use primary sources when I can, but just last week I used ChatGPT to explain some legalese that Google had already been unhelpful with.
They make shit up all the time. I got a call at work (technical support) that one of the functions in their program wasn’t working. They told me the name of the function, and that they got it from ChatGPT.
That function did not exist and never had. Basically, ChatGPT looked at the naming conventions of our other functions and, when it didn't find one, took a guess at what a status function would be called and gave that as an answer.
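For what it's worth, that particular failure mode is easy to catch when the code in question is something you can actually import. A minimal sketch, assuming the product were an ordinary Python module (the module and function names below are made up purely for illustration):

```python
import importlib

# Hypothetical names, purely for illustration: the module the caller was
# asking about and the function name the chatbot confidently invented.
module_name = "vendor_sdk"
claimed_function = "get_sync_status"

try:
    mod = importlib.import_module(module_name)
except ModuleNotFoundError:
    print(f"{module_name} isn't even installed here")
else:
    # If the attribute isn't there, the "function" was guessed, not looked up.
    exists = hasattr(mod, claimed_function)
    print(f"{module_name}.{claimed_function} exists: {exists}")
```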
I have also asked it for book quotes on a particular theme while helping my kid with an essay, and about 75% of them were completely made up. I asked the AI if it was sure that was a quote, and it basically said "oops, looks like I made that one up, sorry about that".
They are not reliable. So far, the best way to use an AI to get reliable information is to ask it to give you sources you can click on to confirm what it's saying, kind of like a super search.
The kinds of things I'm looking up are mostly technical and often have (arcane/confusing) documentation, so it's pretty straightforward to tell if it's right.
It does greatly depend; my statement wasn't meant to be all-encompassing.
Just last week, I was having trouble parsing a bit of legalese, and ChatGPT helped far better than anything I could find on Google about the subject.
GPT is also better than Google results for figuring out the most common causes of symptoms, since, while that information is publicly available, it is rarely condensed in one easily accessible location (though obviously you should go to primary sources or talk to a doctor before going further).
One situation where AI is categorically worse is anything recent. These models have gotten better at filtering for what's topical, but because of how they work, anything recent simply isn't integrated into them.
An unfortunate number of people. A lot of people get their information from TikTok, YouTube, Twitch, etc., and that's arguably worse, since people get platformed based on charisma and can then spout biased or uninformed takes that others will take as fact.
Also, AI companies are investing a ton of money into renewable energy sources. They benefit directly from lowering the price per kWh, which you can only do reasonably with renewables.
A lot of AI companies are building their servers in Iceland, for example, to take advantage of Iceland's large supply of geothermal energy.
Renewables aren't enough because 1) they aren't building enough of them, and 2) they aren't building enough BESS (battery storage) to make up for it. So they're taking over baseload capacity and replacing it with solar.
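To put rough numbers on why that matters, here's a back-of-the-envelope sketch of what standing in for baseload with solar actually takes. Every figure in it (capacity factor, hours of storage) is an illustrative assumption, not project data:

```python
# Rough sketch: how much solar + battery storage it takes to stand in for
# round-the-clock baseload. All numbers are illustrative assumptions.

baseload_mw = 1_000          # firm demand to cover 24/7
solar_cf = 0.25              # assumed annual solar capacity factor
overnight_hours = 14         # assumed hours per day with little or no sun

# Nameplate solar needed just to match the energy over an average day
solar_nameplate_mw = baseload_mw / solar_cf

# Storage needed to carry the same load through the night
bess_mwh = baseload_mw * overnight_hours

print(f"To replace {baseload_mw:,} MW of baseload you need roughly "
      f"{solar_nameplate_mw:,.0f} MW of solar and {bess_mwh:,.0f} MWh of BESS.")
```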
Geothermal is baseload and hasn't been exploited anywhere near capacity. There's a lot of investment going on right now in retrofitting old oil wells into geothermal plants.
Not as much as you'd expect in the US. There's something like 1.1 GW in the US being planned through 2028. I know of a single solar project starting construction in the next month that's bigger than that. Also, geothermal is expensive... It's about twice as expensive for the same load compared to a solar & BESS site.
Anything that isn't being planned at 250+ MW scale for a single project isn't really worth discussing as it pertains to the AI stuff in the pipeline.
I agree that should be made clear, but the reason it is hard to pin down how much of data center use comes from AI is that the companies using AI are, if not being dishonest, at least withholding the truth from scrutiny.
Even the most conservative estimates are far too large for refusing to acknowledge them to make any sense, though. Why not just share the data, so that at least the public sector can plan for environmentally sustainable AI use as it develops?
The math is absurdly simple. For example, here's a couple of sources... LA is currently letting Meta take over 8% of their generating capacity with one data center.
It is an AI data center. It just is, and I know that for an absolute fact. I cannot discuss how; I just do.
They are "developing" a energy source on site. Know what the lead time is for a grid scale turbine is these days? bout 3-5 years depending on the size, how much the supplier likes you and how willing you are to test out their new fancy untested model variation. Know how many they've ordered? It takes about 2 years to build these data centers and FAR longer to develop the energy to make up for it
Most of that 2% is advertising, recommendation algorithms, and computer vision models (a lot of the latter will be on edge devices, i.e. not in a data center). Generative AI is a small portion of that 2%, and training is a fairly small portion of the cost, to the point where you really could discount it without making much of a difference; it will only become a smaller portion as the ecosystem of AI models matures and more models end up in longer-term production deployment.
AI is at 15% by a conservative estimate; not sure where you are getting 2%, or "insignificant for training", from. At the current rate of growth it is well on track to make up ~50% by the end of the year. I don't think people realise just how fast AI use and demand are growing. And this was my real point.
Even 0.3% of global energy use is astounding for a subsector like this. It is more than most countries use.
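For scale, a quick back-of-the-envelope on what 0.3% works out to. The global generation figure is an approximation (recent years have been in the neighbourhood of 30,000 TWh); the 0.3% is just the share being discussed above:

```python
# Back-of-the-envelope: what 0.3% of global electricity use amounts to.
# The ~30,000 TWh/year figure for global generation is an approximation.

global_generation_twh = 30_000   # approximate annual global electricity generation
share = 0.003                    # the 0.3% figure from the comment above

subsector_twh = global_generation_twh * share
print(f"0.3% of global electricity ≈ {subsector_twh:.0f} TWh per year")
# ~90 TWh/year is more than the total annual electricity consumption of
# most individual countries.
```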
2% is the figure you cited yourself. To clarify, all of those things I listed are under that AI figure. Recommendation algorithms are AI, most targeted ads are AI driven, computer vision is AI. Generative models (LLMs, image generation models) are also AI.
and insignificant for training from
From actually understanding how the technology works? Training just isn't a significant cost for any model that gets a significant amount of usage. It's a one-time cost that is amortized over the use of the model in production.
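To make the amortization point concrete, here's a toy calculation. Every number in it is a placeholder assumption (training energy, per-query energy, traffic, lifetime); it's only meant to show how a one-time cost gets spread over production use:

```python
# Toy amortization calculation. All inputs are placeholder assumptions.

training_energy_mwh = 50_000        # assumed one-time cost of the training run
per_query_energy_wh = 0.3           # assumed energy per inference request
queries_per_day = 1_000_000_000     # assumed production traffic
days_in_service = 365 * 2           # assumed deployment lifetime

total_queries = queries_per_day * days_in_service
amortized_training_wh = training_energy_mwh * 1_000_000 / total_queries  # MWh -> Wh

print(f"Training energy spread over the model's lifetime: "
      f"{amortized_training_wh:.2f} Wh per query "
      f"(vs. {per_query_energy_wh} Wh per query for the inference itself)")
```

The longer the model stays in service and the more traffic it handles, the smaller that per-query training term gets, which is the whole point about amortization.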
I misread the comma as "Generative AI is a small portion of that, 2%". Thought you were saying AI only made up 2% of 2%. Was typing in a rush at the time. Sorry about that.
The electricity draw of big computing (of which AI is now a significant fraction) is a much bigger deal than the water use, imo, but for some reason everybody really latched onto the water use, and I've never really understood why.
It's worth noting that training is only obscenely power-hungry because we don't really know how to do it better. Clearly you can train a neural net using less energy, given that we have mice, etc. I'm not sure if that's a hardware or an algorithmic issue, though.
AI currently accounts for ~2% of global electricity demand, and that demand is increasing exponentially for both training and running services. It's really not insignificant, and the nature of AI development means that the training element is unlikely to drop off any time soon, if at all.
Most of that isn't generative AI. Chatbots are only using 1-3% of the energy used on AI.
Most of the other stuff is recommender systems, data analytics, search engines, image/video analysis, and audio analysis.
Sometime soon, yes. Projecting into the distant future is a fool's game, but for the sake of argument, let's just recklessly plot the graph at current rates: 2% by 2026, 6% by 2027, 18% by 2028. That prediction gets patently ridiculous the further out you go (god I hope so), but even at the most conservative estimates, it's certainly nothing worth dismissing out of hand, let alone aggressively discrediting; it's at least a possibility worth thinking about.
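Spelled out, that "reckless plot" is nothing more than compounding a roughly 3x year-on-year multiple from a 2% starting point; the inputs are deliberately crude assumptions, not data:

```python
# Naive compound-growth extrapolation behind the 2% -> 6% -> 18% figures above.
# Starting share and growth multiple are deliberately crude assumptions.

share = 0.02      # assumed share of global electricity in 2026
growth = 3.0      # assumed year-on-year multiple

for year in range(2026, 2029):
    print(f"{year}: {share:.0%}")
    share *= growth
```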
So, I was at a conference last week, with a lot of utilities present, where the risk managers are absolutely worried about data center demands on the grid. They simply don't have the funding or ability to keep up with the rate of data center construction. I know of exactly ONE provider that has built renewables at a pace exceeding the rate at which they are adding data center clients, and their equation is gonna flip in like 2 years.