09.18.24
Roughly a quarter of Americans have used ChatGPT since the chatbot’s 2022 release, according to the Pew Research Center — and every query exacts a cost.
Chatbots use an immense amount of power to respond to user questions, and simply keeping the bot’s servers cool enough to function in data centers takes a toll on the environment. While the exact burden is nearly impossible to quantify, The Washington Post worked with researchers at the University of California, Riverside to understand how much water and power OpenAI’s ChatGPT, using the GPT-4 language model released in March 2023, consumes to write the average 100-word email.
Each prompt on ChatGPT flows through a server that runs thousands of calculations to determine the best words to use in a response.
In completing those calculations, these servers, typically housed in data centers, generate heat. Often, water systems are used to cool the equipment and keep it functioning. Water transports the heat generated in the data centers into cooling towers to help it escape the building, similar to how the human body uses sweat to keep cool, according to Shaolei Ren, an associate professor at UC Riverside.
Where electricity is cheaper or water is comparatively scarce, electricity is often used instead to cool these warehouses with large units resembling air conditioners, he said. That means the amount of water and electricity an individual query requires can depend on a data center’s location and vary widely.
Even in ideal conditions, data centers are often among the heaviest users of water in the towns where they are located, environmental advocates said. But data centers with electrical cooling systems are also raising concerns by driving up residents’ power bills and taxing the electric grid.