AI is rapidly transforming the digital landscape, but this transformation comes with significant energy demands. A recent study highlights the potential for AI to consume as much electricity as an entire country by 2027.
A study by PhD candidate Alex De Vries has produced startling projections about the energy consumption of AI systems. According to De Vries, the exponential growth of AI could push worldwide AI-related electricity consumption to between 85 and 134 terawatt-hours (TWh) annually by 2027, a figure comparable to the annual electricity usage of a country such as the Netherlands.
However, this projection rests on several assumptions holding. De Vries explains that it assumes the current rate of AI growth continues, AI chips remain available in sufficient quantities, and servers run continuously at full capacity.
“The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland (29.3 TWh per year), which is a significant increase compared to its historical AI-related energy consumption,” he stated. At its upper end, the worldwide 2027 projection would amount to “half a percent of our total global electricity consumption.”
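These headline figures can be reproduced with back-of-the-envelope arithmetic. The sketch below assumes a fleet of roughly 1.5 million AI servers running continuously at full capacity by 2027 and global electricity consumption of about 26,500 TWh per year; both are illustrative assumptions not stated in the article, while the 6.5 kW and 10.2 kW power draws are the NVIDIA DGX figures quoted later in this piece.

```python
# Back-of-the-envelope reconstruction of the 2027 projection.
# Assumed (not from the article): ~1.5 million AI servers by 2027,
# running 24/7 at full capacity; global electricity consumption of
# roughly 26,500 TWh per year.
HOURS_PER_YEAR = 8760
SERVERS_2027 = 1_500_000        # assumed fleet size
GLOBAL_TWH = 26_500             # assumed global electricity use, TWh/year

for name, kw in [("DGX A100", 6.5), ("DGX H100", 10.2)]:
    twh = SERVERS_2027 * kw * HOURS_PER_YEAR / 1e9   # kWh -> TWh
    share = 100 * twh / GLOBAL_TWH
    print(f"{name}: {twh:.0f} TWh/year ({share:.2f}% of global)")

# Output:
# DGX A100: 85 TWh/year (0.32% of global)
# DGX H100: 134 TWh/year (0.51% of global)
```

Under these assumptions, the two power draws bracket exactly the 85–134 TWh range, and the upper bound lands on the quoted half a percent of global consumption.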
The study also quotes the chair of Alphabet, Google’s parent company, who pointed out that an interaction with generative AI would “likely cost 10 times more than a standard keyword search.” Google plans to integrate generative AI into its search technology and workspace products, mirroring moves by other tech giants such as Microsoft, which has already begun embedding AI models across its software.
In 2021, Google’s total electricity usage across offices and data centres was already significant at approximately 18.3 TWh, and even before the AI boom sparked by technologies like ChatGPT, AI accounted for 10-15% of that consumption. If Google were to serve each of its up to 9 billion daily searches with AI, De Vries calculates, each request would consume an average of 6.9-8.9 Wh. That per-request figure is of the same order of magnitude as the 3.96 Wh per request observed for Hugging Face’s BLOOM model.
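As a quick sanity check, applying those per-request figures to the full search volume reproduces the Ireland-scale scenario quoted above. The snippet below is a sketch of the arithmetic only, not De Vries’s exact methodology:

```python
# Annual electricity implied by the per-request estimates, assuming
# all of Google's ~9 billion daily searches were served by AI.
REQUESTS_PER_DAY = 9e9

for wh in (6.9, 8.9):
    twh_per_year = REQUESTS_PER_DAY * wh * 365 / 1e12   # Wh -> TWh
    print(f"{wh} Wh/request -> {twh_per_year:.1f} TWh/year")

# Output:
# 6.9 Wh/request -> 22.7 TWh/year
# 8.9 Wh/request -> 29.2 TWh/year
```

The upper end comes out within rounding distance of the 29.3 TWh worst case cited earlier.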
De Vries emphasizes the potential for optimization in AI systems to mitigate these energy demands. “In addition to hardware efficiency improvements, innovations in model architectures and algorithms could help to mitigate or even reduce AI-related electricity consumption in the long term,” the paper notes. For instance, Google’s Generalist Language Model (GLaM) uses nearly seven times as many parameters as GPT-3 yet required 2.8 times less energy to train.
For a more accurate projection of global AI-related electricity consumption, De Vries suggests tracking NVIDIA’s sales, given the company’s roughly 95% share of the AI server market in 2023.
“If operating at full capacity (i.e., 6.5 kW for NVIDIA’s DGX A100 servers and 10.2 kW for DGX H100 servers), [NVIDIA’s] servers would have a combined power demand of 650–1,020 MW,” he states.
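That range implies a fleet of roughly 100,000 servers, a number that can be backed out directly from the quoted power draws (the fleet size is inferred here, not stated in the article):

```python
# Server count implied by the quoted power-demand range.
for name, kw, mw in [("DGX A100", 6.5, 650), ("DGX H100", 10.2, 1020)]:
    servers = mw * 1000 / kw        # MW -> kW, divided by per-server draw
    print(f"{name}: {mw} MW / {kw} kW = {servers:,.0f} servers")

# Output:
# DGX A100: 650 MW / 6.5 kW = 100,000 servers
# DGX H100: 1020 MW / 10.2 kW = 100,000 servers
```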
With NVIDIA’s sales surpassing analyst expectations in early 2023, the AI server supply chain appears poised to meet this projected growth.