New research suggests that ChatGPT uses less energy than previously estimated. According to Epoch AI, a typical ChatGPT query consumes about 0.3 watt-hours, roughly a tenth of the commonly cited 3 watt-hour figure and on par with the energy draw of everyday household devices.
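To put that figure in context, here is a back-of-the-envelope sketch in Python. The per-query values come from the estimates above; the daily query count and the 10 W LED bulb used for comparison are illustrative assumptions, not figures from the research.

```python
# Back-of-the-envelope comparison of ChatGPT query energy.
# The 0.3 Wh and 3 Wh figures come from the article; the daily
# query count and the 10 W LED bulb are illustrative assumptions.

EPOCH_ESTIMATE_WH = 0.3   # Epoch AI's per-query estimate
OLDER_ESTIMATE_WH = 3.0   # the commonly cited older figure

QUERIES_PER_DAY = 15      # assumed heavy personal usage
LED_BULB_WATTS = 10       # a typical LED bulb, for comparison

daily_wh = QUERIES_PER_DAY * EPOCH_ESTIMATE_WH
bulb_minutes = daily_wh / LED_BULB_WATTS * 60  # minutes of bulb runtime

print(f"Daily query energy: {daily_wh:.1f} Wh")
print(f"Equivalent LED bulb runtime: {bulb_minutes:.0f} minutes")
print(f"Older estimate would be {OLDER_ESTIMATE_WH / EPOCH_ESTIMATE_WH:.0f}x higher")
```

Under these assumptions, even heavy daily use amounts to less energy than running a single LED bulb for half an hour.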
The study was conducted by Joshua You, a data analyst at Epoch AI, who argues that earlier estimates were outdated because they assumed OpenAI was running older, less efficient chips. As You told TechCrunch, “AI’s energy consumption stands out but remains insignificant when compared to residential heating or car operation.”
Still, overall AI energy consumption is rising as companies build new data centers to keep pace with escalating demand. A Rand report projects that training a frontier AI model in 2030 could require as much power as eight operating nuclear reactors. OpenAI itself is committing billions of dollars to the AI infrastructure behind ChatGPT and its related services.
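For a sense of scale, the sketch below converts the Rand comparison into rough numbers. The eight-reactor figure is from the report; the per-reactor output of about 1 gigawatt and the average household draw are general assumptions for illustration.

```python
# Rough scale of the Rand projection for frontier model training in 2030.
# Eight reactors is the report's comparison; ~1 GW per reactor and the
# average household draw are typical values assumed here for illustration.

REACTORS = 8
GW_PER_REACTOR = 1.0          # assumed typical reactor output
training_power_gw = REACTORS * GW_PER_REACTOR

US_HOME_AVG_KW = 1.2          # assumed average US household draw
homes_powered = training_power_gw * 1e6 / US_HOME_AVG_KW  # GW -> kW

print(f"Projected training power: {training_power_gw:.0f} GW")
print(f"Roughly equivalent to {homes_powered:,.0f} average homes")
```

Under these assumptions, a single training run would draw about as much power as several million homes, which illustrates why the infrastructure buildout runs into the billions.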

Some ChatGPT features, such as image generation and processing long documents, likely exceed the 0.3 watt-hour estimate. OpenAI is also developing reasoning models that require more power to produce their responses, and the arrival of more efficient models such as o3-mini does not appear to be curbing the overall growth in demand for AI compute.
Users who want to reduce their AI energy footprint should, according to You, use smaller models such as GPT-4o-mini and limit data-intensive tasks. As AI adoption advances, organizations will need to balance efficiency against performance to keep environmental impacts in check.
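As a rough illustration of that advice, the sketch below compares a workload sent entirely to a full-size model against one that routes simpler queries to a smaller model. The 0.3 Wh figure is from Epoch AI; the per-query energy assumed for the smaller model and the 70/30 query mix are hypothetical, since neither appears in the research.

```python
# Illustrative savings from routing simple queries to a smaller model.
# The 0.3 Wh figure is Epoch AI's estimate; the mini model's per-query
# energy and the 70/30 query mix are hypothetical, for illustration only.

FULL_MODEL_WH = 0.3    # Epoch AI estimate for a typical query
MINI_MODEL_WH = 0.05   # assumed, not a published figure

QUERIES = 1000
SIMPLE_FRACTION = 0.7  # assumed share of queries a small model can handle

baseline = QUERIES * FULL_MODEL_WH
mixed = QUERIES * (SIMPLE_FRACTION * MINI_MODEL_WH
                   + (1 - SIMPLE_FRACTION) * FULL_MODEL_WH)

print(f"All full-model:   {baseline:.0f} Wh")
print(f"Mixed routing:    {mixed:.0f} Wh")
print(f"Estimated saving: {1 - mixed / baseline:.0%}")
```

Under these assumed numbers the mixed approach cuts energy use by more than half, which is the kind of efficiency-versus-performance trade-off the advice points to.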