
As a student, if you are not using ChatGPT these days you are a fool: it is a convenient, quick, and seemingly accurate tool that can help you with a wide range of tasks – writing articles, summarizing long texts, explaining complex physics exercises, generating weekly grocery lists, or planning your next trip.
But have you ever wondered what it costs to power this efficiency?
“A lot of people think about the internet as just being in the cloud, but really it is taking up this huge space of these gigantic computers, which are data centers” – Julie Bolthouse, Director of Land Use, Piedmont Environmental Council
As you know, generative AI is part of the tech industry, which is notoriously resource-intensive.

According to experts from the University of California, Riverside, a single conversation with ChatGPT – roughly 20 to 50 questions – running on its older model, GPT-3, is equivalent to pouring out a half-liter bottle of fresh water. This may not sound like much, until you consider that the chatbot has more than 100 million active users, each engaging in multiple conversations daily, Forbes emphasized.
With GPT-4 being even more advanced, its power consumption and therefore its water usage are likely even higher. However, tech companies have not disclosed precise or reliable information about the energy or water consumed by the latest AI models.
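To put those figures in perspective, here is a quick back-of-envelope calculation in Python. The per-question breakdown follows directly from the cited range; the aggregate assumes, purely for illustration, that each of the 100 million users has just one conversation per day.

```python
# Back-of-envelope estimate from the figures cited above:
# ~0.5 L of water per GPT-3 conversation of 20-50 questions (UC Riverside).

WATER_PER_CONVERSATION_L = 0.5          # liters per conversation (cited)
QUESTIONS_PER_CONVERSATION = (20, 50)   # range cited above

# Water per single question, derived from the cited range
water_per_question = [
    WATER_PER_CONVERSATION_L / q for q in QUESTIONS_PER_CONVERSATION
]

# Illustrative aggregate: 100 million users, one conversation each per day
# (a deliberately conservative assumption, since the article notes users
# engage in multiple conversations daily)
users = 100_000_000
daily_water_liters = users * 1 * WATER_PER_CONVERSATION_L

print(f"Water per question: {water_per_question[1]:.3f}-{water_per_question[0]:.3f} L")
print(f"Daily water at one conversation per user: {daily_water_liters / 1e6:.0f} million liters")
```

Even under this conservative one-conversation-per-day assumption, the total comes to some 50 million liters of fresh water every day.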
Why does AI consume so much water?
Every time we ask GenAI a question, a request is sent to a data center, a massive facility filled with AI chips capable of handling tasks that our phones and laptops’ hardware cannot. These data centers rely on two key resources to operate.
The first is electricity: it is estimated that a single AI query consumes 10 times more energy than a conventional search, and this estimate is derived from data on AI models from 2023, which used significantly less energy than current models.
The second requirement is water, which serves three main purposes: electricity generation, server manufacturing, and data center cooling. AI chip production requires vast amounts of water, which is difficult to recycle once used due to toxic chemical contamination. Moreover, cooling systems must use fresh, drinking-quality water to prevent server damage from impurities; much of it is permanently lost to evaporation, requiring continuous replenishment.
“Data centres have always needed cooling because computers are essentially like radiators, with most of their energy converted into heat. However, the servers that are built for AI are processing huge amounts of data and have more power density and greater cooling demands.” – The Times

It has been calculated that, on average, these data centers use 1.14 million liters of water every day, equivalent to roughly 7,000 10-minute showers. A study by UC Riverside revealed that Microsoft’s U.S. data centers directly evaporated some 700,000 liters of clean freshwater just to train the GPT-3 language model – figures the company has largely kept secret. Finally, the same research projects that by 2027 AI’s water withdrawal (freshwater taken from ground or surface water sources, either permanently or temporarily, and conveyed to a place of use) will reach 4.2 to 6.6 billion cubic meters, exceeding half of the United Kingdom’s annual water consumption.
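For readers who want to check these comparisons, here is a short sanity-check in Python. The daily data center figure and the 2027 withdrawal range come from the article; the shower flow rate and the UK annual consumption figure are my own rough assumptions, not numbers from the cited sources.

```python
# Sanity-check of the comparisons above. The shower flow rate and the UK
# annual consumption figure are illustrative assumptions, not cited data.

DATA_CENTER_DAILY_L = 1_140_000     # 1.14 million liters per day (cited)
SHOWER_FLOW_L_PER_MIN = 16          # assumed typical shower flow rate
shower_liters = SHOWER_FLOW_L_PER_MIN * 10  # one 10-minute shower

showers_per_day = DATA_CENTER_DAILY_L / shower_liters
print(f"{showers_per_day:.0f} ten-minute showers per day")

# Projected AI water withdrawal for 2027, in billion cubic meters (cited)
ai_withdrawal_bcm = (4.2, 6.6)
uk_annual_bcm = 8.0                 # assumed rough UK annual consumption
fraction_of_uk = [w / uk_annual_bcm for w in ai_withdrawal_bcm]
print(f"{fraction_of_uk[0]:.0%}-{fraction_of_uk[1]:.0%} of assumed UK annual use")
```

At an assumed 16 liters per minute, 1.14 million liters works out to just over 7,000 ten-minute showers, consistent with the figure above; and even the low end of the 2027 projection sits above half of the assumed UK annual total.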
Although we lack precise data regarding the current electricity and water consumption of major AI companies, what is clear is that generative AI development and usage are likely to keep growing in the coming decades. That means more data centers, higher electricity demand, and increased water consumption, which in turn contribute to greater greenhouse gas emissions and further water resource depletion.

What can we do?
While some solutions are slowly starting to emerge to make AI more energy-efficient at a company level, I wish to focus on how we can approach this issue on a personal level, and I hope this article can provide some food for thought.
Firstly, think before you prompt: if a simple Google search can provide the answer, use it instead of running an AI query.
But most importantly, be mindful of your usage: the key to addressing this issue is realizing that, as consumers, we can make a choice – just as someone might choose to be vegetarian or vegan due to environmental concerns, we can choose to limit or abstain from AI usage.
AI is an incredible tool, but it comes with hidden costs: be mindful!
By Ludovica Vittoria Barzaghi, Marketing Associate
REFERENCES
Criddle, C., & Bryan, K. (2024, February 25). AI Boom Sparks Concern over Big Tech’s Water Consumption. Financial Times. https://www.ft.com/content/6544119e-a511-4cfa-9243-13b8cf855c13
Gordon, C. (2024, February 25). AI Is Accelerating the Loss of Our Scarcest Natural Resource: Water. Forbes. https://www.forbes.com/sites/cindygordon/2024/02/25/ai-is-accelerating-the-loss-of-our-scarcest-natural-resource-water/
Gordon, C. (2024b, March 12). ChatGPT And Generative AI Innovations Are Creating Sustainability Havoc. Forbes. https://www.forbes.com/sites/cindygordon/2024/03/12/chatgpt-and-generative-ai-innovations-are-creating-sustainability-havoc/
Guerrini, F. (2023, April 14). AI’S Unsustainable Water Use: How Tech Giants Contribute To Global Water Shortages. Forbes. https://www.forbes.com/sites/federicoguerrini/2023/04/14/ais-unsustainable-water-use-how-tech-giants-contribute-to-global-water-shortages/
Li, P., Yang, J., Islam, M. A., & Ren, S. (2023). Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models. arXiv. https://doi.org/10.48550/arxiv.2304.03271
Sellman, M., & Vaughan, A. (2024, October 4). “Thirsty” ChatGPT uses four times more water than previously thought. The Times. https://www.thetimes.com/uk/technology-uk/article/thirsty-chatgpt-uses-four-times-more-water-than-previously-thought-bc0pqswdr
Walther, C. C. (2024, November 12). Generative AI’s Impact On Climate Change: Benefits And Costs. Forbes. https://www.forbes.com/sites/corneliawalther/2024/11/12/generative-ais-impact-on-climate-change-benefits-and-costs/