It is no secret that AI usage is on the rise, but this alarming trend will damage the environment through AI’s extensive water and energy demands. If this persists, data centers’ consumption of resources will ultimately affect younger generations, including students at State High.
Before assessing the impact of AI, one must understand what it truly is and why it requires so much energy. AI is a “catch-all term for a group of technologies that can process information and, at least superficially, mimic human thinking,” according to the United Nations Environment Programme. Specifically, a generative AI system learns to generate outputs that look like the data it was trained on.
The main difference between generative AI tools, such as OpenAI’s ChatGPT, and older models is scale: ChatGPT is far larger and more complex, with billions of parameters, and has been trained on an enormous dataset of publicly available internet text.
Generative AI tools, such as ChatGPT, are trained and deployed in data centers: large “temperature-controlled building[s] that house computing infrastructure, such as servers, data storage drives, and network equipment,” according to MIT News, often sprawling over tens or hundreds of thousands of square feet. However, the rapid development and deployment of generative AI models produce devastating environmental repercussions, including increased water consumption and electricity demand.
Water Consumption
Large quantities of water are needed to cool the hardware used for training, deploying, and fine-tuning generative AI models, which can strain municipal water supplies and disrupt local ecosystems. Chilled water is used to cool a data center by absorbing heat from computing equipment.
It has been estimated that, for each kilowatt-hour of energy a data center consumes, it needs about two liters of water for cooling. This demand was explained to MIT News by Noman Bashir, the lead author of the 2024 impact paper “The Climate and Sustainability Implications of Generative AI.” For context, a kilowatt-hour is a unit of energy that measures the amount of electricity a device uses over time; it is equal to 1,000 watts of power used for one hour.
Additionally, large data centers can consume up to 5 million gallons of water per day, equivalent to the water usage of a town with a population of 10,000 to 50,000 people, according to the Environmental and Energy Study Institute (EESI).
“For a tool such as AI, even though it can in some cases be helpful, and that’s a whole other debate…it’s kind of just a useless and unnecessary way to waste that much water, especially when it’s every single search and every time you open an app, it goes by really fast,” Kazie Dingwell, senior and president of Environmental Club, said.
According to scientists at the University of California, Riverside, each AI-generated, 100-word email is estimated to use roughly one bottle of water (about 519 milliliters). Although one bottle of water doesn’t sound like much, it adds up when millions of users around the world are entering prompts into systems like ChatGPT every day.
Additionally, the strain on freshwater resources to supply sufficient water for data centers has a direct impact on communities. Even as droughts and water shortages reduce water availability, developers continue tapping surface and underground reservoirs to cool data centers. According to the American Society of Civil Engineers, data centers can draw from numerous water supply sources, including potable (drinkable) water. This water comes from municipal resources, meaning the water used to absorb heat from computing equipment is the same water people in nearby communities could have been drinking.
Energy Consumption
Data centers consume a substantial amount of electricity as well. According to MIT News, “in a 2021 research paper, scientists from Google and the University of California at Berkeley estimated the training process [for the GPT-3 model] alone consumed 1,287 megawatt hours of electricity (enough to power about 120 average U.S. homes for a year), generating about 552 tons of carbon dioxide.”
“AI use is completely taking major steps back on any progress that might have been made,” Dingwell added.
Furthermore, using ChatGPT is more harmful than using a search engine like Google. According to the International Energy Agency (IEA), a single ChatGPT request requires ten times the electricity of a typical Google search. Moreover, deploying these models in real-world applications, enabling millions to use generative AI in their daily lives, and then fine-tuning the models to improve their performance all continue to draw large amounts of energy long after a model has been developed.
Similarly, data centers are increasing in size, number, and power requirements. “Scientists have estimated that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023,” according to MIT News. This growth is driven by artificial intelligence; constant new models, more comprehensive capabilities, and increasing complexity call for more power, accelerating the energy demand.
“The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants,” Bashir said.
“I think the main issue with this is that people don’t really know about it, so people just use it, and then it burns up energy really fast,” Dingwell said. “Some people just don’t really care, don’t really focus, or don’t really know what to do about it.”
“I think that we live in a little bit of a bubble here, right? There are a lot of things that we do that we are not on the front lines of the consequences of those things, and that’s true for a lot of people across our country…I don’t think that we really understand the costs,” Kim Li Kimel, the Driver’s Safety teacher and the Environmental Club advisor, said. “If you put something out of sight and out of mind, chances are that it is still impacting you in very real ways and you aren’t going to recognize it until it’s too late.”
In conclusion, although data centers have been around for decades, the rise of generative AI has dramatically increased the pace of data center construction, prompting a surge in environmental consequences through the rapid consumption of water and electricity. Less drinkable water, increased carbon emissions, and the environmental degradation that results from these consequences will negatively affect both the planet and humans, including students at State High.
“I think that the right thing to do is [to] think about the power of partial solutions. So what small thing can I do that isn’t going to necessarily inherently save the world, but it’s a small manageable thing that I can do that just makes me more aware,” Kimel advised. “There’s a lot of value in us just being aware of our human potential, but our human impact.”
So, next time students at State High are having trouble with homework or want a quick answer to a question, they should stop to weigh the consequences and consider the statistics presented. Are faster, larger, smarter generative AI models worth the resource strain? Is the destruction of our planet worth the increase in efficiency? These questions should also be considered by the large tech companies developing new AI models and driving the construction of data centers, as they are the ones behind all the major decisions. Still, younger generations, like the students at State High, can make a difference, whether they realize it or not, by limiting generative AI usage and advocating for increased awareness and consumption reform.
