A new study by the nonprofit research institute Epoch AI has found that ChatGPT's energy consumption is far lower than previously estimated. According to the findings, the chatbot's current default model, GPT-4o, consumes approximately 0.3 watt-hours per query, one-tenth of the widely cited earlier estimate of 3 watt-hours.
Using OpenAI's current default model, GPT-4o, Epoch's researchers found that a typical query consumes only about 0.3 watt-hours of electricity. This stands in stark contrast to earlier estimates, which put each ChatGPT interaction at around 3 watt-hours. The study attributes the gap to advances in AI hardware and software and to more accurate calculation methods that reflect current efficiency.
To put ChatGPT's energy consumption into perspective, it helps to compare it with other common activities. At 0.3 watt-hours per query, ChatGPT's energy use is roughly on par with a single Google search. This is significantly less than many everyday tasks:
Boiling water for tea consumes about 100 watt-hours
Watching TV for an hour uses approximately 120 watt-hours
A daily iPhone 15 charge uses about 4.7 watt-hours
The average U.S. household uses about 29 kilowatt-hours of electricity per day
These comparisons show that while AI tools like ChatGPT do consume energy, the impact of any individual query is small relative to many routine activities. The cumulative effect of millions of daily queries is still substantial, however, which underscores the need for continued efficiency improvements in AI systems.
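The comparisons above reduce to simple division. A quick sketch, using only the figures from the list above, shows how many ChatGPT queries each everyday task is worth:

```python
# Back-of-envelope equivalences for the 0.3 Wh/query figure.
# All input numbers come from the comparisons cited in this article.
CHATGPT_WH_PER_QUERY = 0.3

everyday_tasks_wh = {
    "boiling water for tea": 100,
    "one hour of TV": 120,
    "average US household, one day": 29_000,  # 29 kWh
}

for task, wh in everyday_tasks_wh.items():
    queries = wh / CHATGPT_WH_PER_QUERY
    print(f"{task}: ~{queries:,.0f} ChatGPT queries")
```

By this arithmetic, a single cup of tea is worth roughly 330 queries, and one day of average U.S. household electricity use is worth nearly 100,000.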
Several factors contribute to the improved energy efficiency of ChatGPT:
Adoption of more advanced hardware, with newer Nvidia H100 chips replacing older A100 models
Optimization of AI systems, including more realistic assumptions about output token length
Updated calculation methods that better reflect current technology, assuming 70% server power utilization rather than peak usage
Continuous refinement of computational models to reduce energy consumption without compromising performance
These advancements have collectively led to a significant reduction in power requirements, challenging previous assumptions about AI's energy footprint and paving the way for more sustainable AI development.
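A rough sanity check illustrates how a figure like 0.3 watt-hours per query can arise from these factors. In the sketch below, the 70% utilization is the study's stated assumption and 700 W is the rated maximum power of an Nvidia H100 (SXM); the idea of attributing a query's energy to seconds of single-GPU time is purely an illustrative simplification, not the study's actual methodology:

```python
# Illustrative check: how much single-GPU time per query would be
# consistent with ~0.3 Wh? Only the 70% utilization figure comes
# from the study; the rest is a simplifying assumption.
H100_TDP_WATTS = 700       # rated maximum power of an Nvidia H100 (SXM)
UTILIZATION = 0.70         # study assumes 70% of peak, not full TDP
TARGET_WH_PER_QUERY = 0.3

avg_power_w = H100_TDP_WATTS * UTILIZATION            # ~490 W
energy_joules = TARGET_WH_PER_QUERY * 3600            # 0.3 Wh = 1080 J
gpu_seconds_per_query = energy_joules / avg_power_w   # ~2.2 s

print(f"~{gpu_seconds_per_query:.1f} s of single-GPU time per query")
```

On these assumptions, 0.3 Wh corresponds to a little over two seconds of one GPU's time per query, which is at least the right order of magnitude for a short chatbot response.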
The environmental impact and future implications of ChatGPT's energy consumption are complex and evolving. While recent studies suggest lower energy use than previously thought, the cumulative effect of AI systems remains significant:
Annual energy consumption for ChatGPT is projected to reach 226.8 GWh, enough to fully charge about 3.13 million electric vehicles.
Scientists estimate that AI energy consumption could rise to as much as 134 terawatt-hours by 2027.
The International Energy Agency (IEA) reports that data centers and data transmission networks already account for 1-1.5% of global electricity use.
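The first projection above can be cross-checked: dividing the projected annual total by the number of vehicles should recover a plausible per-vehicle charge. This is a consistency check on the two cited figures, not new data:

```python
# Consistency check on the projection cited above:
# 226.8 GWh per year vs. 3.13 million full EV charges.
ANNUAL_GWH = 226.8
EV_COUNT = 3.13e6

wh_total = ANNUAL_GWH * 1e9               # GWh -> Wh
kwh_per_ev = wh_total / EV_COUNT / 1e3    # kWh per full charge

print(f"~{kwh_per_ev:.1f} kWh per EV charge")
```

The result, roughly 72 kWh per charge, is in line with a typical EV battery pack, so the two cited figures are mutually consistent.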
Looking ahead:
AI infrastructure demands are growing, potentially doubling energy consumption by 2026 compared to 2023 levels.
Development of more complex models and increasing global deployment of AI systems may offset efficiency gains.
The AI industry is investing in energy-efficient hardware and renewable energy sources to mitigate environmental impacts.
While individual ChatGPT queries may have a relatively small energy footprint, the scale of AI adoption necessitates ongoing monitoring and improvement of energy efficiency in the sector.