Google and Microsoft announced significant changes to their search engines while spending heavily on developing or purchasing generative AI tools to provide users with a richer, more accurate experience.
However, the optimism around these new technologies could be masking a dirty little secret. The race to build high-performance, AI-powered search engines will almost certainly demand a significant increase in processing power, and with it a massive increase in the amount of energy tech companies use and the amount of carbon they generate.
The training of GPT-3, the large language model that underpins ChatGPT, is estimated to have consumed 1,287 MWh and produced more than 550 tons of carbon dioxide, equivalent to one person making 550 roundtrips between New York and San Francisco, according to an analysis conducted by independent researchers. Unfortunately, neither OpenAI nor Google has publicly disclosed the computing costs of its products.
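As a rough sanity check on that comparison, the arithmetic fits in a few lines of Python (a back-of-envelope sketch; the figure of roughly one metric ton of CO2 per passenger for a New York-San Francisco roundtrip is an outside assumption, not a number from the researchers’ analysis):

```python
# Back-of-envelope check on the reported GPT-3 training figures.
# Assumption (not from the cited analysis): a NY-SF roundtrip
# emits roughly 1 metric ton of CO2 per passenger.
TRAINING_EMISSIONS_TONS = 550   # reported training emissions
TRAINING_ENERGY_MWH = 1_287     # reported training energy use
CO2_PER_ROUNDTRIP_TONS = 1.0    # assumed per-passenger roundtrip emissions

roundtrips = TRAINING_EMISSIONS_TONS / CO2_PER_ROUNDTRIP_TONS
print(f"Equivalent roundtrips: {roundtrips:.0f}")               # -> 550

# Implied carbon intensity of the electricity used for training:
kg_per_mwh = TRAINING_EMISSIONS_TONS * 1000 / TRAINING_ENERGY_MWH
print(f"Implied grid intensity: ~{kg_per_mwh:.0f} kg CO2/MWh")  # ~427
```

On those assumptions, the two reported numbers are internally consistent with a fairly typical fossil-heavy electricity mix.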
“There are already huge resources involved in indexing and searching internet content, but the incorporation of AI requires a different kind of firepower,” Alan Woodward, professor of cybersecurity at the University of Surrey in the UK, told Wired.
While internet use accounts for about 4% of global greenhouse gas emissions, the power required to train a single artificial intelligence model can produce hundreds of thousands of pounds of carbon emissions.
According to analysts, combining AI with the volume of search requests that companies like Google and Microsoft handle could require up to five times more computing power. And as the computing grows, so will the greenhouse gas emissions.
“It requires processing power as well as storage and efficient search,” Woodward added.
“Every time we see a step change in online processing, we see significant increases in the power and cooling resources required by large processing centers. I think this could be such a step.”
Additionally, more data centers will be needed to store data for the new search engines. According to Martin Bouchard, the CEO of data center startup QScale, “at least four or five times more computation per search” will be required due to AI.
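To get a feel for what a four-to-fivefold multiplier could mean at search scale, here is an illustrative calculation (all inputs are assumptions for the sketch, not figures from Bouchard or the companies involved: 0.3 Wh per query is Google’s oft-cited 2009 estimate, and the daily query volume is a commonly quoted rough figure):

```python
# Illustrative scaling arithmetic; inputs are assumptions, not
# disclosed figures from Google, Microsoft, or QScale.
WH_PER_SEARCH = 0.3        # Google's 2009 per-query energy estimate
SEARCHES_PER_DAY = 8.5e9   # rough, commonly quoted daily query volume
AI_MULTIPLIER = 5          # "four or five times more computation per search"

baseline_mwh = WH_PER_SEARCH * SEARCHES_PER_DAY / 1e6  # Wh -> MWh
ai_mwh = baseline_mwh * AI_MULTIPLIER

print(f"Baseline: ~{baseline_mwh:,.0f} MWh/day")            # ~2,550
print(f"With AI:  ~{ai_mwh:,.0f} MWh/day")                  # ~12,750
print(f"Added:    ~{ai_mwh - baseline_mwh:,.0f} MWh/day")   # ~10,200
```

On these deliberately rough inputs, the added daily load alone would rival roughly eight GPT-3-scale training runs’ worth of energy every day; the point is the multiplier, not the exact totals.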
Moving data centers to cleaner energy sources and redesigning neural networks to be more efficient could reduce both the environmental impact and the energy cost of integrating AI into search. Greater efficiency would also cut the so-called “inference time”: the amount of computing power an algorithm needs to work on new data.
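A common rule of thumb from the scaling-law literature (an outside assumption, not something the article’s sources state) is that a transformer’s inference cost is roughly two floating-point operations per parameter per generated token, which is why shrinking or streamlining a network cuts inference compute almost proportionally:

```python
# Rough rule of thumb (assumption from the scaling-law literature):
# autoregressive inference costs ~2 FLOPs per parameter per token.
def inference_flops(num_params: float, num_tokens: int) -> float:
    """Approximate forward-pass compute for generating num_tokens."""
    return 2 * num_params * num_tokens

# Hypothetical comparison: a GPT-3-scale model vs. a smaller,
# more efficient redesign answering the same 200-token query.
for name, params in [("175B-parameter model", 175e9),
                     ("13B-parameter model", 13e9)]:
    flops = inference_flops(params, 200)
    print(f"{name}: ~{flops:.1e} FLOPs per answer")
# Compute (and thus energy) falls roughly in proportion to model size.
```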
The question, however, is whether all the additional computing power and hassle is worth what might be only marginal gains in search accuracy.
Nafise Sadat Moosavi, a lecturer at the University of Sheffield who works on sustainability in natural language processing, believes that while it is vital to scrutinize the energy used and carbon generated by large language models, some perspective is necessary.
“It’s great that this actually works for end users,” Moosavi says. “Because previous large language models weren’t accessible to everybody.”