Your ChatGPT Query Also Consumes Water… Did You Know? 

Artificial intelligence (AI) has changed our lives in many ways: it helps us at work, entertains us and even simplifies everyday tasks. But behind all this technological progress lies an environmental cost that often goes unnoticed. A clear example is the recent boom in AI image filters, such as the Studio Ghibli style in ChatGPT, which consumed millions of liters of water in just a few days. How is that possible? Let’s break it down.

AI Water Consumption: A Hidden Problem

To function, AI relies on huge data centers that process information non-stop. These servers generate a lot of heat and must be cooled with water. According to the National Autonomous University of Mexico (UNAM), every ChatGPT interaction can consume up to 500 milliliters of water. That may sound like very little, but multiplied across millions of interactions a day, consumption skyrockets.

One striking case was the Studio Ghibli-style image craze: in just one week, approximately 216 million liters of water were used to generate these illustrations. To put that in perspective, imagine 100 people each using 380 liters of water a day: it would take them roughly 15 years to go through the same amount.
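
For readers who want to check those numbers themselves, here is a minimal back-of-the-envelope sketch in Python. The 500 milliliters per interaction and 216 million liters come from the figures above; the 10 million daily queries is a purely illustrative, hypothetical round number, not a reported statistic.

    # Back-of-the-envelope check of the water figures above.
    # 500 mL per interaction and 216 million liters are the article's figures;
    # 10 million daily queries is an illustrative assumption, not a reported number.

    liters_per_query = 0.5                         # up to 500 mL per ChatGPT interaction (UNAM)
    hypothetical_queries_per_day = 10_000_000      # illustrative assumption
    print(f"{liters_per_query * hypothetical_queries_per_day:,.0f} liters per day")  # 5,000,000

    # Ghibli-filter comparison: 216 million liters vs. 100 people using 380 liters a day each
    ghibli_liters = 216_000_000
    group_use_per_day = 100 * 380                  # 38,000 liters per day
    years = ghibli_liters / group_use_per_day / 365
    print(f"about {years:.1f} years")              # about 15.6 years, matching the estimate above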

And the problem is not going to shrink. In fact, it is estimated that by 2027, AI-related water consumption could reach up to 6.6 billion cubic meters, according to the OECD. This not only reduces the availability of the resource but also deepens the technology’s environmental impact.

AI’s High Energy Consumption and Carbon Footprint

Water is not the only problem. AI also requires a lot of electricity, which generates carbon emissions. According to the International Energy Agency (IEA), global electricity demand will rise by 5% by 2025, and much of that increase is due to the growth of AI.

Data centers already account for 1% to 1.3% of global electricity consumption, and that share is expected to reach around 4% by the end of the decade. To put it in perspective, training a single AI model can use as much electricity as a small city and emit hundreds of tons of CO2.
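
To make the "hundreds of tons of CO2" claim concrete, the sketch below shows how such estimates are typically assembled: training energy multiplied by the carbon intensity of the electricity grid. Both inputs here (1,300 MWh of training energy and 0.4 kg of CO2 per kWh) are hypothetical round values chosen for illustration, not figures reported in this article.

    # Illustrative emissions estimate: training energy x grid carbon intensity.
    # Both inputs are hypothetical round numbers, not reported figures.
    training_energy_mwh = 1_300                    # assumed energy to train one large model
    grid_intensity_kg_per_kwh = 0.4                # assumed average grid carbon intensity
    tons_co2 = training_energy_mwh * 1_000 * grid_intensity_kg_per_kwh / 1_000
    print(f"about {tons_co2:,.0f} tons of CO2")    # about 520 tons, i.e. hundreds of tons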

A study by Goldman Sachs Research estimates that AI energy consumption will grow by 15% between 2023 and 2030, which would require roughly 47 gigawatts of additional power generation capacity. This is a huge challenge for global efforts to reduce carbon emissions.

What Can We Do? Possible Solutions

To prevent AI from becoming an even bigger environmental problem, several organizations have proposed solutions:

  • Require companies to report how much water and energy they consume.
  • Create more efficient algorithms that use fewer resources.
  • Build green data centers with renewable energy sources.
  • Establish specific environmental regulations for AI.

Technological progress does not have to be at odds with caring for the planet. With the right regulations in place and greater transparency, artificial intelligence can keep evolving without compromising our natural resources. It is time to take notice and act before it is too late.
