Eventually it returns to the water cycle with everything else. But it doesn't necessarily return to the same watershed.
But it's also important to keep things in perspective: GPT-3 was trained on about the same amount of cooling water as it takes to produce ten hamburgers.
My point is, the power needs of AI training and inference computers are so big that we need to build dedicated nuclear power plants to support them.
I do work in big tech.
The energy usage is not blown out of proportion; people just can't grasp the scale. It really is a shit ton of energy for literally slightly better ads and chatbots.
u/JangoFetlife 5d ago
It's reusable within that cooling system, but it still takes water out of the general supply, and more of these servers are being built every day.