AI could soon use as much power as the Netherlands

Power-hungry AI operations could increase global carbon emissions

AI has emerged as a rapidly growing digital trend over the past 12 months, and if its adoption becomes more widespread, it could use as much energy as an entire country.

That's according to a recent study published in the journal Joule, in which data scientist Alex de Vries at Vrije Universiteit Amsterdam in the Netherlands estimates that by 2027, AI server farms could consume 85 to 134 terawatt-hours (TWh) of energy annually.

That figure is roughly equivalent to the annual energy consumption of a country like the Netherlands, and represents 0.5% of current global electricity usage.

"We don't have to completely blow this out of proportion," de Vries told The New York Times. "But at the same time, the numbers that I write down — they are not small."

Given the expanding demand for AI services, it is highly probable that the associated energy consumption will grow substantially in the coming years.

In 2022, data centres accounted for between 1% and 1.3% of the world's total electricity consumption. Cryptocurrency mining represented an additional 0.4%.

The electricity required for AI operations is likely to increase global carbon emissions until more renewable power sources are introduced.

Generative AI is becoming increasingly accessible to the public, with chatbots such as OpenAI's ChatGPT being widely used by students, coders, designers and writers. These chatbots are built upon AI models that are trained on extensive datasets, which requires a lot of energy.

In his research paper, de Vries says that Hugging Face, a US-based AI company, recorded its multilingual text-generation AI model as consuming 433 megawatt-hours (MWh) during its training process. That is sufficient to power 40 average US homes for an entire year.
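The comparison above can be sanity-checked with simple arithmetic. A minimal sketch, assuming an average US household uses roughly 10.6 MWh of electricity per year (an assumption based on typical US figures, not a number from the paper):

```python
# Back-of-the-envelope check of the training-energy comparison.
TRAINING_ENERGY_MWH = 433    # training run reported in the article
US_HOME_ANNUAL_MWH = 10.6    # assumed average US household consumption per year

homes_powered_for_a_year = TRAINING_ENERGY_MWH / US_HOME_ANNUAL_MWH
print(f"{homes_powered_for_a_year:.0f} homes")  # ~41, close to the article's 40
```

The result lines up with the "40 average US homes" claim, give or take rounding in the assumed per-home figure.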

Tools like ChatGPT also require a substantial amount of computational power and, consequently, energy resources when generating text in response to prompts.

De Vries suggests that running ChatGPT could consume 564 MWh of electricity per day. Furthermore, he estimates that if Google were to employ AI for its approximately nine billion daily searches, it would require 29.2 TWh of power annually. That is comparable to the electricity consumption of Ireland and nearly double Google's total energy consumption of 15.4 TWh in 2020.
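Dividing the annual figure across the searches shows what the scenario implies per query. A rough sketch using only the numbers quoted above (the 365-day year and the implied per-search figure are my own arithmetic, not from the paper):

```python
# Implied energy per AI-assisted search under the Google scenario.
SEARCHES_PER_DAY = 9e9       # approximate daily Google searches, per the article
ANNUAL_ENERGY_TWH = 29.2     # de Vries's estimate for AI-powered search

wh_per_search = ANNUAL_ENERGY_TWH * 1e12 / (SEARCHES_PER_DAY * 365)
print(f"{wh_per_search:.1f} Wh per search")  # ~8.9 Wh
```

For scale, that is several times the energy usually attributed to a conventional web search, which is the crux of de Vries's warning.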

"We recognise training large models can be energy-intensive and is one of the reasons we are constantly working to improve efficiencies," a spokesperson for OpenAI told New Scientist.

"We give considerable thought about the best use of our computing power."

Thomas Wolf, the co-founder of Hugging Face, suggests that there are indications of smaller AI models now approaching the capabilities of larger ones, which could lead to substantial energy savings.

Models like Mistral 7B and Meta's Llama 2, which are 10 to 100 times smaller than GPT-4, are capable of performing many of the same tasks and functions, he said.

"Not everyone needs GPT4 for everything, just like you don't need a Ferrari to go to work."