Integrating AI into search engines could require four to five times the computational power, and with it a much larger carbon footprint.
ChatGPT has sparked a race among the world’s largest tech companies to develop AI. Microsoft kicked off this competition with a $10 billion investment in OpenAI to integrate this chatbot into its Bing search engine and Edge browser.
The search giant Google followed suit by announcing its own AI chatbot named Bard. Meanwhile, the Chinese company Baidu has indicated that it is testing an AI technology called Ernie Bot.
However, the excitement surrounding these new tools may be masking a serious issue. The race to build high-performance, AI-assisted search engines will demand a corresponding leap in computing power.
This, in turn, means a significant increase in the amount of energy required to operate these machines and the carbon emissions they produce.
Chatbots like ChatGPT are driving the AI integration race among search giants. (Image: Wired).
A Massive Carbon Burden for the Planet
“Vast resources are already dedicated to indexing and searching Internet content, and the addition of AI requires even more. Artificial intelligence demands processing power as well as storage and efficient search. Every time we see a step change in online processing, we see significant increases in the power and cooling resources required by large processing centers,” said Alan Woodward, a cybersecurity professor at the University of Surrey in the UK.
Both ChatGPT and Bard are built on large language models (LLMs) that analyze and compute connections across massive datasets. This is why these chatbots tend to be developed by companies with substantial resources.
“Training these models requires an enormous amount of power to perform the calculations. Right now, only Big Tech companies have the resources to do so,” explained Carlos Gomez-Rodriguez, a computer scientist at the University of A Coruña in Spain.
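To give a concrete sense of the kind of computation involved, below is a minimal sketch (an illustration, not code from any of these companies) of scaled dot-product attention, the core operation transformer-based LLMs use to relate every token in a sequence to every other; the tiny dimensions are purely illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal attention: every token's query is compared against every
    other token's key, so the cost grows with the square of the sequence
    length -- one reason LLM training and inference are so expensive."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise token "connections"
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V  # each output is a weighted mix of the values

# Toy example: 4 tokens with 8-dimensional embeddings (illustrative sizes).
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(tokens, tokens, tokens)
print(out.shape)  # (4, 8)
```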
AI technology requires processing power, storage, and search efficiency. (Image: Wired).
Neither OpenAI nor Google has disclosed the operating costs of its AI products. However, third-party researchers estimate that training GPT-3, the model ChatGPT is built on, consumed about 1,287 MWh of electricity and produced the equivalent of more than 550 tons of CO2.
For comparison, that is roughly the emissions of a single person taking 550 round trips between New York and San Francisco.
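As a rough sanity check on those third-party estimates, the arithmetic below assumes a grid carbon intensity of about 0.43 tCO2e per MWh and roughly one ton of CO2e per round trip; both figures are illustrative assumptions, not numbers from the article.

```python
# Back-of-the-envelope check of the third-party estimates above.
# Grid intensity and per-trip emissions are assumptions for illustration.

TRAINING_ENERGY_MWH = 1_287       # estimated GPT-3 training energy
GRID_INTENSITY_T_PER_MWH = 0.43   # assumed tCO2e per MWh of grid power
TRIP_EMISSIONS_T = 1.0            # assumed tCO2e per NY-SF round trip

emissions_t = TRAINING_ENERGY_MWH * GRID_INTENSITY_T_PER_MWH
print(f"Estimated training emissions: {emissions_t:.0f} tCO2e")        # ~553
print(f"Equivalent round trips: {emissions_t / TRIP_EMISSIONS_T:.0f}")  # ~553
```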
“This number isn’t alarming on its own, but consider that these companies not only have to train their models but also run them to serve millions of users,” Gomez-Rodriguez noted.
Martin Bouchard, co-founder of the Canadian data center company QScale, believes, based on his reading of Microsoft’s and Google’s plans for their search engines, that integrating AI will require at least “four to five times the computational power for each search result.”
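To see what such a multiplier could mean at search-engine scale, here is a back-of-the-envelope sketch; the per-query energy and daily query volume are illustrative assumptions, not figures from Bouchard or the article.

```python
# Rough scaling sketch for the "four to five times" estimate.
# Per-query energy and query volume are illustrative assumptions only.

BASELINE_WH_PER_QUERY = 0.3   # assumed energy of one conventional search
QUERIES_PER_DAY = 8.5e9       # assumed daily query volume
AI_MULTIPLIER = 4.5           # midpoint of the "four to five times" claim

baseline_mwh = BASELINE_WH_PER_QUERY * QUERIES_PER_DAY / 1e9  # Wh -> MWh
ai_mwh = baseline_mwh * AI_MULTIPLIER
print(f"Conventional search: ~{baseline_mwh:,.0f} MWh/day")
print(f"AI-assisted search:  ~{ai_mwh:,.0f} MWh/day "
      f"(+{ai_mwh - baseline_mwh:,.0f} MWh/day)")
```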
Bouchard also pointed out that ChatGPT’s knowledge of the world currently stops at the end of 2021, partly in an effort to reduce its computational demands.
ChatGPT’s knowledge of the world stops at the end of 2021, partly to cut computing requirements. (Image: Phuong Lam).
“If they are going to retrain the model with more parameters and a larger volume of content, that is a completely different scale. It will demand significant investment in hardware. Current data centers and infrastructure will not be able to cope with the demands of the AI race. It’s too much,” Bouchard said.
Solutions from Tech Giants
According to the International Energy Agency (IEA), data centers already account for about 1% of global greenhouse gas emissions. The AI integration race will likely increase demand for cloud computing, despite search companies’ earlier pledges to reduce their contribution to global warming.
“AI is certainly not as bad as transportation or the textile industry. Still, it can be a significant contributor to global emissions,” Gomez-Rodriguez told Wired.
To address its emissions, Microsoft has committed to becoming carbon negative by 2050. In a blog post, the company said it has paid 15 suppliers across 26 projects worldwide to remove 1.3 million tons of carbon.
Meanwhile, Google has announced plans to achieve net-zero emissions across its operations and value chain by 2030.
Both Microsoft and Google have committed to reducing carbon emissions to negative levels in the future. (Image: Phuong Lam).
According to Wired, the environmental impact and energy costs of integrating AI into search engines could be alleviated if data centers transition to using clean energy sources.
Another option is to redesign neural networks (computing systems whose interconnected nodes work like the neurons in a human brain) to be more efficient, reducing what is known as “inference time”: the computing power an algorithm consumes when working on new data.
“We need to research ways to reduce the inference time required for such large models. Now is a good time to focus on the efficiency aspects of the models,” argued Nafise Sadat Moosavi, a lecturer in natural language processing at the University of Sheffield.
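One widely used efficiency technique along these lines (an example of the general idea, not a method Moosavi specifically names) is post-training dynamic quantization, which stores linear-layer weights as 8-bit integers to cut inference cost. A minimal PyTorch sketch:

```python
import time
import torch
import torch.nn as nn

# A small stand-in model; real LLMs are vastly larger, but the idea is
# the same: most inference time is spent in large linear layers.
model = nn.Sequential(
    nn.Linear(1024, 4096), nn.ReLU(),
    nn.Linear(4096, 1024),
).eval()

# Dynamic quantization converts Linear weights to 8-bit integers,
# trading a little accuracy for cheaper inference on CPU.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(64, 1024)
for name, m in [("fp32", model), ("int8", quantized)]:
    start = time.perf_counter()
    with torch.no_grad():
        for _ in range(50):
            m(x)
    print(f"{name}: {time.perf_counter() - start:.2f}s")
```

The timing loop simply compares the original and quantized models on the same batch; on typical CPUs the int8 version runs noticeably faster, which is the kind of inference-time saving the researchers above have in mind.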