AI programs can seem incorporeal, but they are powered by a network of servers in data centres. Photograph: Pitinan Piyavatin/Alamy

As the AI industry booms, what toll will it take on the environment?


Tech companies remain secretive over the amount of energy and water it takes to train their complex programs and models

One question that ChatGPT can’t quite answer: how much energy do you consume?

“As an AI language model, I don’t have a physical presence or directly consume energy,” it’ll say, or: “The energy consumption associated with my operations is primarily related to the servers and infrastructure used to host and run the model.”

Google’s Bard is even more audacious. “My carbon footprint is zero,” it claims. Asked about the energy that is consumed in its creation and training, it responds: “not publicly known”.

AI programs can seem incorporeal. But they are powered by networks of servers in data centers around the world, which require large amounts of energy to power and large volumes of water to keep cool.

Because AI programs are so complex, they require more energy than other forms of computing. But the trouble is – it’s incredibly hard to nail down exactly how much.

As they compete to build ever more sophisticated AI models, companies such as OpenAI – which created ChatGPT – Google and Microsoft will not disclose just how much electricity and water it takes to train and run their AI models, what sources of energy power their data centers, or even where some of those data centers are.

Meta, the parent company of Facebook, for example, announced last year that it was building what it believed to be the world’s fastest supercomputer, called the AI Research SuperCluster (RSC). But it would not reveal where the supercomputer was located or how it was being powered.

Now, as the tech industry rushes to incorporate generative AI into virtually everything – from email and search to food delivery apps and mental health services – industry experts and researchers warn that the technology’s unchecked growth could come at a significant environmental cost.

“This exponential use of AI brings with it the need for more and more energy,” said Sasha Luccioni, the climate lead for the AI company Hugging Face. “And yet we’re seeing this shift of people using generative AI models just because they feel like they should, without sustainability being taken into account.”

Amazon data centers are seen in Manassas, Virginia. Photograph: Shuran Huang/For The Guardian

Luccioni is one of a handful of researchers who have tried to assess the emissions generated in the creation of specific AI models.

In a research paper that has not yet been peer-reviewed, she and her co-authors tallied the amount of energy used to train Hugging Face’s own large language model, Bloom, on a supercomputer; the energy used to manufacture the supercomputer’s hardware and maintain its infrastructure; and the electricity used to run the program once it launched. They found that it generated about 50 metric tons of carbon dioxide emissions, the equivalent of an individual taking about 60 flights between London and New York.

Bloom’s energy footprint is lower than those of other generative AI programs, Luccioni and her team estimate, since the supercomputer Bloom was trained on is powered by nuclear energy, which doesn’t produce carbon emissions. By contrast, limited publicly available data suggests about 500 metric tons of CO2 were produced just in the training of GPT-3, the model behind ChatGPT – the equivalent of more than a million miles driven by average gasoline-powered cars, the researchers noted.
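Those comparisons can be sanity-checked with back-of-envelope arithmetic. The per-unit figures below are illustrative assumptions, not numbers from the researchers: roughly 0.9 tonnes of CO2 per one-way London–New York flight, and about 400g of CO2 per mile for an average gasoline-powered car.

```python
# A rough, non-authoritative sanity check of the comparisons above.
# Both per-unit emissions figures are assumptions, not published data.
bloom_tonnes = 50    # estimated emissions from training Bloom
gpt3_tonnes = 500    # rough outside estimate for training GPT-3

flights = bloom_tonnes / 0.9            # assumed ~0.9 t CO2 per flight
car_miles = gpt3_tonnes * 1000 / 0.4    # assumed ~0.4 kg CO2 per mile

print(f"Bloom: roughly {flights:.0f} London-New York flights")
print(f"GPT-3: roughly {car_miles:,.0f} car-miles")  # ~1.25m miles
```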

“For ChatGPT’s latest model, GPT-4, [OpenAI] hasn’t said anything about either how long it’s been trained, where it’s trained, or anything at all about the data they’re using,” said Luccioni. “So essentially, it means it’s impossible to estimate emissions.”

Meanwhile, newer AI models are getting bigger – and more energy intensive. Bigger models require more, and more powerful, graphics processing units (GPUs), and take longer to train – using up more resources and energy, Luccioni said.

Even less clear is the amount of water consumed in the creation and use of various AI models. Data centers use water in evaporative cooling systems to keep equipment from overheating. One non-peer-reviewed study, led by researchers at UC Riverside, estimates that training GPT-3 in Microsoft’s state-of-the-art US data centers could have consumed 700,000 liters (about 185,000 gallons) of freshwater.

In the absence of accurate public data, the researchers had to assume a “water use effectiveness”, the ratio of water used to keep a data center cooled and functioning to the energy it consumes, based on Microsoft’s self-reported average.
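In sketch form, the method multiplies that assumed ratio by an estimate of the energy used in training. The numbers below are illustrative assumptions, not figures disclosed by Microsoft or OpenAI: a water use effectiveness of about 0.55 liters per kilowatt-hour, near Microsoft’s self-reported average, and a widely cited outside estimate of roughly 1,300 megawatt-hours to train GPT-3.

```python
# A minimal sketch of the study's water estimate, under assumed inputs.
# Neither figure is company-disclosed; both are assumptions noted above.
wue_liters_per_kwh = 0.55         # assumed water use effectiveness
training_energy_kwh = 1_300_000   # assumed ~1,300 MWh to train GPT-3

water_liters = wue_liters_per_kwh * training_energy_kwh
gallons = water_liters / 3.785    # liters per US gallon
print(f"roughly {water_liters:,.0f} liters ({gallons:,.0f} US gallons)")
```

That lands in the same ballpark as the study’s 700,000-liter figure, which is why small changes in the assumed ratio move the estimate so much.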

The actual number of liters used could vary significantly based on where and when exactly GPT-3 was trained: in searing Arizona, lots of water would be needed to keep servers from overheating, whereas in Wyoming a center might use less. The design of the specific data centers could also dramatically affect the numbers. Rather than using water-intensive evaporative cooling systems, a center might use traditional air conditioning – which uses less water, but more electricity.

Google became the first tech giant to publicize its water usage worldwide, but provided average figures that concealed important details about the local impacts of its data centers. After a protracted legal battle with the Oregonian, the city of The Dalles, Oregon, released data showing that Google data centers used a quarter of the town’s water supply.

Because an AI project’s water use effectiveness could be used to guess at its compute capacity, companies want to keep their water use a secret, said Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside. “They want to give us as little information as possible,” he said.

In general, companies have tended to build data centers where power is cheap. As big tech firms like Google and Microsoft strive for a net-zero emissions target, they may be especially motivated to build in areas where solar or wind power are plentiful – like Arizona – but water is scarce.

Meta and OpenAI did not respond to the Guardian’s requests for comment. Google and Microsoft declined to provide an on-the-record response.

When top executives from leading AI companies called for regulation to prevent the “existential risk” posed by AI, it spurred speculation on the threats that superintelligence posed to society. But researchers warned that one of the more immediate, pertinent risks was environmental.

If companies were more transparent about the natural resources used and carbon emissions released in the creation and use of AI models, they could help open up discussions about when and how to use generative AI strategically, said Luccioni. It may be worth the environmental cost to use generative AI in cancer treatment, but a waste to use it where simpler tools would do.

And yet, generative AI has become a fixation. “There’s this idea that your company is passé if you’re not using it,” said Luccioni.

A couple of months ago, OpenAI began offering companies paid access to incorporate ChatGPT into their apps, and businesses including Instacart, the online grocery delivery company, are using the feature to customize grocery lists and ingredient recommendations. And last month, Google announced that it would be incorporating generative AI into Gmail and search – using far more complex and energy-intensive technology to accomplish essentially the same tasks. Companies have also suggested using similar tools for bank fraud detection, despite the existence of statistical models that are already quite good at the task.

“It’s frustrating because actually there are so many low-impact, efficient AI approaches and methods that people have developed over the years, but people want to use generative AI for everything,” said Luccioni. “It’s like using a microscope to hammer in a nail – it might do the job but that’s not really what this tool is meant for.”
