ChatGPT, alongside numerous other artificial intelligence (AI) tools, has become increasingly integrated into our daily routines. Whether you rely on it in your workflow, love it for recipes, or try to avoid it altogether, the environmental impact of AI has been quietly taking its toll in the background.
AI technology has been marketed as seamless and intangible, with everything relying on the “cloud” for storage and processing. This idea of AI floating somewhere up in the air allows users to distance themselves from the technology’s very material environmental effects. To understand how AI impacts the environment, it is necessary to understand the computer processes behind every prompt.
GPT-3, the large language model (LLM) behind early versions of ChatGPT, is built on a Generative Pre-trained Transformer architecture. To understand human inputs and generate responses to requests, the model must first be ‘trained’ to recognise language patterns so it can better predict the next words in a text. This training requires an immense amount of resources to complete. According to a study by Strubell, Ganesh and McCallum, training a single large language model can produce roughly the same amount of carbon dioxide as five passenger vehicles across their entire operational lifespans. Water is consumed alongside that carbon: separate estimates put the fresh water used to train GPT-3 at around 700,000 litres. Once trained, an LLM processes a prompt by comparing the input against the massive quantity of data it ‘learned’ from. Through a combination of data comparison and statistical probability, the LLM produces the answer it predicts is most likely to be correct. Those statistical calculations are what ChatGPT runs through when it ‘thinks’; the abstract concept of thought is converted into computer calculations, every one of which requires some form of electricity generation.
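The prediction step described above can be sketched in miniature. Everything below is a toy illustration with made-up words and probabilities; a real LLM learns distributions over tens of thousands of tokens from terabytes of text, but the principle of picking the statistically likeliest continuation is the same:

```python
import random

# Toy "learned" model: made-up probabilities of which word follows which.
# These values are placeholders for demonstration, not real model weights.
next_word_probs = {
    "the": {"cat": 0.5, "dog": 0.3, "cloud": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "sat": {"down": 1.0},
}

def sample_next(word):
    """Sample a continuation according to the learned probabilities."""
    candidates = next_word_probs.get(word)
    if candidates is None:
        return None  # no continuation was learned for this word
    words = list(candidates)
    weights = [candidates[w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]

def most_likely_next(word):
    """The deterministic 'best guess': the highest-probability continuation."""
    candidates = next_word_probs.get(word)
    return max(candidates, key=candidates.get) if candidates else None
```

Calling `most_likely_next("the")` here returns `"cat"`, the 50% option; every such lookup is, at scale, one more batch of electricity-hungry arithmetic on a data-centre chip.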
Both the human brain and computer systems require fuel and water to function properly, but the types and quantities involved are vastly different. The water consumption, or footprint, of AI can be divided into two categories. Indirect consumption refers to the manufacturing of computer components, while direct consumption measures the water required for cooling systems at data centres. As the computers complete their vast numbers of calculations, their systems heat up, requiring a cooling system to prevent overheating. One of the most common options for data centres is a water-cooled system, in which a fan blows air across a chilled water coil to deliver cooled air to the processing chips. The water for these systems often comes from lakes and rivers, depleting the natural resources available to meet local demand. With some data centres consuming up to 5 million gallons (roughly 19 million litres) daily, water consumption can pose a threat to the supply of surrounding communities. A single figure like that is easy to dismiss, but when the number of data centres keeps growing to accommodate an even steeper rise in AI users worldwide, how do we track the damage?
A logical first step toward reducing the environmental impact would be to identify the most energy-intensive processes and determine ways to make them more efficient. This is where the virtual brick wall appears. Environmental reporting for AI is uncommon and impact assessment remains unreliable, with some surveys showing that fewer than one-third of data centres track their water usage or consider it a priority. The methods currently available focus mainly on the energy consumption and carbon footprints of machine learning systems, which, while relevant, do not account for all aspects of AI’s environmental impact. Meanwhile, the electricity powering data centres remains primarily fossil-fuel based; substituting renewable energy sources would lower the impact further.
Waterless cooling technology exists but has not yet been widely implemented because it requires more physical space and can take longer to break even. Therein lies another question: should data centre owners focus more on the economic or the environmental costs of their businesses? As of now, the primary focus leans toward the economic side. When asked, ChatGPT can generate rough, unverified approximations of the energy consumed during a conversation. Several factors go into the estimate: the length and output requirements of the conversation, the location of the data centre, and even the type of processing chip handling the request. By asking the model for an ongoing energy consumption dashboard, users can be mindful of the real-world effects of their AI usage. Third-party programmes such as CodeCarbon, CarbonTracker, and Eco2AI can provide more realistic summaries, but they require some knowledge of computer science to set up properly.
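To show how those factors combine, a back-of-envelope estimate multiplies power draw by time to get energy, then energy by the local grid’s carbon intensity to get emissions. Every constant below is an assumed placeholder for illustration, not a measured value for any real model, chip, or data centre:

```python
# Back-of-envelope sketch: energy = power draw x time,
# emissions = energy x grid carbon intensity.
# All constants are assumed placeholders, not measured values.

CHIP_POWER_KW = 0.4          # assumed average draw of one accelerator chip
GRID_KG_CO2_PER_KWH = 0.5    # assumed carbon intensity of the local grid
SECONDS_PER_RESPONSE = 3.0   # assumed processing time per response

def estimate_emissions(num_responses, chips_used=1):
    """Return (energy in kWh, emissions in kg CO2) for a conversation."""
    hours = num_responses * SECONDS_PER_RESPONSE / 3600
    energy_kwh = hours * CHIP_POWER_KW * chips_used
    emissions_kg = energy_kwh * GRID_KG_CO2_PER_KWH
    return energy_kwh, emissions_kg

energy, co2 = estimate_emissions(num_responses=20, chips_used=8)
print(f"{energy:.4f} kWh, {co2:.4f} kg CO2")
```

Tools such as CodeCarbon automate exactly this kind of arithmetic, replacing the guessed constants with measured power draw and regional grid data.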
This is not meant to discourage or speak poorly of those who use ChatGPT or other AI systems on a regular basis. In fact, it is safe to assume that every person with access to technology and the Internet has used an AI tool at some point. Major corporations like Google, Apple, and Samsung have seamlessly integrated AI into their operating systems. Rather, it is important to remember that ecological literacy, or building awareness of the environmental consequences of using digital systems, is a beneficial habit to incorporate into daily use.
As we navigate into an increasingly automated world, it should be recognised that automation does not equal less impact.
Kate Crawford says in Atlas of AI that understanding the environmental impact of AI is “vital at this moment in history, when the impacts of anthropogenic climate change are already well underway.” AI presents incredible opportunities for innovation across all fields, but while appreciating its possibilities, the negative outputs must be acknowledged as well.
Written by Hannah Schaffer, Edited by Alexandra Steinhoff
Photo Credit: Anna Házas & Mariami Gavashelishvili, 2026