The next time you start a conversation with OpenAI’s ChatGPT, spare a thought for the bottle of water that interaction requires. It adds up, given that more than 100 million people use the chatbot each month.
Training natural language models such as ChatGPT, the wildly popular chatbot created by the start-up OpenAI, backed by Microsoft (ticker: MSFT), requires water to cool the data-center servers that run those programs. The same goes for Bard, developed by Alphabet's (GOOGL) Google.
Cooling towers, which use a lot of water, are the most efficient and most widely used way to cool those servers in many areas, though air cooling, refrigerants, or a combination of methods are options as well. Servers also soak up lots of electric power, and generating that electricity consumes water, too.
Solar and wind don’t consume water, but coal, gas, and nuclear energy use a lot.
The "enormous water footprint" of AI models has "remained under the radar," say researchers from the University of California, Riverside, and the University of Texas at Arlington, in a paper awaiting peer review. "This is extremely concerning, as freshwater scarcity has become one of the most pressing challenges shared by all of us."
States across the western U.S. are wrestling with how to manage water shortages fueled by a two-decade-long "megadrought." Extreme heat, aging infrastructure, and chronic overuse from agriculture, manufacturing, people, and cities are making things worse.
Earlier this month, the Biden administration put forward a proposal that would allow the federal government to impose cuts on the amount of water Arizona, California, and Nevada take from the shrinking Colorado River.
“People need to realize that using machine learning models and training machine learning models isn’t water-free,” Shaolei Ren, associate professor of electrical and computer engineering at the University of California, Riverside, and a co-author of the paper, tells Barron’s. “They consume a finite resource.”
The authors write that a simple conversation with ChatGPT consumes 500 milliliters, or about 17 fluid ounces, of water. They warn these numbers “are likely to increase by multiple times for the newly launched GPT-4 that has a significantly larger model size.”
The researchers note water usage depends on when and where ChatGPT is used: Consumption is higher during the hotter parts of the day, when more water is needed to cool the data centers. Training GPT-3 in Microsoft’s U.S. data centers can consume 700,000 liters, or 184,920 gallons, of clean freshwater over 15 days, according to the paper.
That 700,000 liters, for context, is enough to produce 370 BMW cars or 320 Tesla electric vehicles, the paper says.
Ren says the number is fairly conservative because the authors don’t know when or where GPT-3 was trained. The number could be a few times higher than 700,000 liters if GPT-3 was trained in the summer in a hot part of the country, he says.
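For readers who want to check the arithmetic, the conversions behind those figures are straightforward. The short Python sketch below simply reproduces the unit conversions quoted in the article; the monthly aggregate at the end is a purely illustrative assumption (one 500-milliliter conversation per each of the 100 million monthly users), not a figure from the paper.

```python
# Back-of-the-envelope arithmetic using only the figures cited in the article.
# The per-conversation and training numbers come from the paper the article
# describes; the monthly aggregate is illustrative, not a claim from the study.

LITERS_PER_GALLON = 3.78541
ML_PER_FL_OZ = 29.5735

water_per_conversation_ml = 500    # roughly one bottle per ChatGPT conversation
gpt3_training_liters = 700_000     # GPT-3 training in Microsoft's U.S. data centers

# Unit conversions quoted in the article
print(f"{water_per_conversation_ml / ML_PER_FL_OZ:.1f} fl oz per conversation")   # ~16.9
print(f"{gpt3_training_liters / LITERS_PER_GALLON:,.0f} gallons to train GPT-3")  # ~184,920

# Illustrative only: if each of 100 million monthly users had a single
# 500 ml conversation, inference alone would consume about 50 million liters.
monthly_users = 100_000_000
monthly_liters = monthly_users * water_per_conversation_ml / 1000
print(f"~{monthly_liters:,.0f} liters per month at one conversation per user")
```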
“To respond to the global water challenges, AI models can, and also should, take social responsibility and lead by example by addressing their own water footprint,” the researchers say.
James Rees, chief impact officer at Botanical Water Technologies, a company that captures water from fruit and vegetable processing, says technology companies “are becoming more proactive and they’re doing the right things around water stewardship. However, this technology, and the use of the technology, is advancing very quickly.”
He described the technology as having "a double-edged impact" in terms of water usage and its broader environmental footprint. For the most part, he said, AI tools are being used for technological development and can be used to find better ways to reduce carbon emissions or conserve energy and water.
Large corporations, including Microsoft, Meta Platforms' (META) Facebook, and Google, have all pledged to be "water positive" and replenish more water than they use in their direct operations by 2030.
OpenAI didn’t respond to a request for comment.
In a statement emailed to Barron's after publication, a Microsoft spokesperson said the company is tackling its water consumption in two ways: reducing its water use intensity, the amount of water it consumes per megawatt of energy used for operations, and replenishing supplies in the water-stressed regions where it operates.
“As part of our commitment to create a more sustainable future, Microsoft is investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application,” the spokesperson said.
Google referred Barron’s to a blog post that describes what the company calls a climate-conscious approach to cooling its data centers and lays out its commitment to “champion responsible water use.”
According to the blog post, in 2021, the average Google data center consumed 450,000 gallons of water a day, about the same amount of water it would take to irrigate 17 acres of turf lawn grass once. That is also equivalent to the water needed to manufacture 160 pairs of jeans, including growing the cotton, the post says.
Google said that as climate change continues to exacerbate water challenges around the world, the company remains “committed to investing in technologies that reduce both energy and water consumption.”
Write to Lauren Foster at [email protected]