ChatGPT, the popular chatbot developed by OpenAI, has drawn considerable scrutiny over its energy consumption. Reports indicate that ChatGPT may use more than 500,000 kilowatt hours of electricity per day to serve approximately 200 million requests. To put this in perspective, the average US household uses approximately 29 kilowatt hours per day; this means ChatGPT uses more than 17,000 times the daily energy of the average household.
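The comparison above can be sanity-checked with a quick calculation. This is a minimal sketch using only the figures reported in the text; both numbers are the article's reported estimates, not independently measured values.

```python
# Figures as reported in the text (estimates, not measurements)
chatgpt_daily_kwh = 500_000    # reported daily electricity use of ChatGPT
household_daily_kwh = 29       # average US household daily electricity use

# Ratio of ChatGPT's daily consumption to one household's
ratio = chatgpt_daily_kwh / household_daily_kwh
print(f"ChatGPT uses roughly {ratio:,.0f}x the daily energy of an average US household")
# 500,000 / 29 is about 17,241, consistent with "more than 17,000 times"
```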
This concern is heightened further when considering the broader implications of integrating generative AI into everyday applications. For example, if Google incorporated generative AI into every search, it could consume approximately 29 billion kilowatt hours per year; this exceeds the annual electricity consumption of entire countries such as Kenya, Guatemala, and Croatia.
The hardware that powers AI is a major factor in this heavy energy consumption. A single AI server can draw as much electricity as more than ten UK households combined; this points to a rapid rise in electricity demand as artificial intelligence deployment expands.
Estimating the overall energy consumption of the AI industry is difficult due to the changing nature of AI workloads and the lack of transparency from technology companies about their operations. But according to calculations by data scientist Alex de Vries, the sector could consume between 85 and 134 terawatt hours of electricity per year by 2027.
That range would represent a significant share of global electricity consumption, possibly reaching 0.5%. To put this in perspective, some of the world’s largest corporate energy consumers, such as Samsung, Google, and Microsoft, currently use tens of terawatt hours of electricity per year.
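The 0.5% figure can be roughly reproduced from the de Vries range. In this sketch, the global electricity consumption figure (about 25,000 TWh per year) is an assumption added for illustration and does not come from the text; only the 85–134 TWh range does.

```python
# Assumption: global electricity consumption of ~25,000 TWh/year
# (an external ballpark figure, not stated in the article)
global_twh = 25_000

# De Vries's projected range for AI by 2027, from the text
ai_low_twh, ai_high_twh = 85, 134

share_low = ai_low_twh / global_twh * 100
share_high = ai_high_twh / global_twh * 100
print(f"Projected AI share of global electricity: {share_low:.2f}% to {share_high:.2f}%")
# Upper bound works out to roughly 0.5%, in line with the forecast
```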
As the AI industry continues to grow, it is important to address the environmental impact of its energy use. Efforts to develop more efficient AI models and to be more transparent about energy consumption are important steps toward mitigating these problems.