Environmental Impact of AI
We know there are growing questions about AI’s environmental impact—and we welcome them. As AI becomes increasingly embedded in everyday tools and systems, it’s essential we pursue progress responsibly.
⚡ What’s the actual energy usage of ChatGPT?
Some claims circulating online suggest that a single ChatGPT query consumes up to 3 watt-hours of electricity. But those figures are often based on speculation and outdated assumptions.
In contrast, an independent analysis by Epoch AI found that a typical ChatGPT query on GPT-4o likely consumes around 0.3 watt-hours—roughly one-tenth of earlier estimates. That’s about the energy needed to run a laptop or an LED lightbulb for a few minutes.
“Even for a heavy chat user, the energy cost of ChatGPT will be a small fraction of the overall electricity consumption of a developed-country resident.” — Epoch AI
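A quick back-of-envelope check makes these comparisons concrete. The 0.3 Wh figure comes from the Epoch AI estimate above; the LED wattage (10 W), the "heavy user" query count (100/day), and the daily residential consumption (~30 kWh) are illustrative assumptions, not sourced numbers:

```python
# Back-of-envelope comparison of per-query energy.
# Only QUERY_WH is from the Epoch AI estimate; the rest are assumed figures.

QUERY_WH = 0.3             # Epoch AI estimate for a typical GPT-4o query (watt-hours)
LED_BULB_W = 10            # assumed wattage of a common LED bulb
QUERIES_PER_DAY = 100      # assumed "heavy user" query count
RESIDENT_KWH_PER_DAY = 30  # assumed daily electricity use of a developed-country resident

# How long the same energy would run the LED bulb, in minutes
led_minutes = QUERY_WH / LED_BULB_W * 60
print(f"One query ≈ {led_minutes:.1f} minutes of LED light")

# A heavy user's daily chat energy as a share of overall consumption
daily_wh = QUERY_WH * QUERIES_PER_DAY
share = daily_wh / (RESIDENT_KWH_PER_DAY * 1000)
print(f"Heavy use ≈ {daily_wh:.0f} Wh/day, {share:.2%} of daily electricity")
```

Under these assumptions a single query lights the bulb for under two minutes, and even 100 queries a day works out to about 0.1% of a resident's daily electricity use—consistent with the Epoch AI quote above.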
🔬 What do the researchers say?
The University of Michigan’s ML Energy Initiative has been at the forefront of measuring actual AI energy usage. Their team developed a tool called ZeusMonitor to track real-time GPU power consumption and built the ML.Energy Leaderboard to benchmark various models. Their findings? Many open-source LLMs use far less energy per query than is often reported. While proprietary models like GPT-4 are not yet benchmarked on the leaderboard, the data points toward an improving trend in energy efficiency across the board.
⚙️ What’s driving these gains?
AI efficiency has improved dramatically over the past decade:
- GPU innovations (like Nvidia’s Hopper architecture) have made chips 10–15x more efficient than a few years ago.
- Software optimization has multiplied chip performance without requiring new hardware.
- Techniques like quantization and pruning are helping shrink models while maintaining quality.
- Tools like Perseus are being developed to orchestrate training across thousands of GPUs with minimal wasted energy—cutting total power usage by up to 30%.
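To illustrate one item from the list above, here is a minimal sketch of symmetric int8 quantization—the general idea behind shrinking models, not OpenAI's or any specific library's implementation. Weights are stored as 8-bit integers plus a single float scale, cutting storage roughly 4x versus 32-bit floats while keeping values close to the originals:

```python
# Minimal sketch of symmetric per-tensor int8 quantization (illustrative only).

def quantize(weights):
    """Map floats to int8 range [-127, 127] using one shared scale factor."""
    scale = max(abs(w) for w in weights) / 127
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the integers and the scale."""
    return [v * scale for v in q]

weights = [0.52, -1.27, 0.003, 0.9]   # toy example values
q, scale = quantize(weights)
restored = dequantize(q, scale)

# Every restored weight lies within half a quantization step of the original
assert all(abs(a - b) <= scale / 2 for a, b in zip(weights, restored))
```

Real systems add refinements (per-channel scales, calibration, quantization-aware training), but the core trade-off is the same: a small, bounded rounding error in exchange for a large reduction in memory and compute.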
🌱 Where does OpenAI stand?
While proprietary energy benchmarks for models like GPT-4 aren’t yet published, we are actively working to reduce energy usage across all stages of AI development—from training to inference.
We're also:
- Investing in hardware and software efficiency improvements.
- Collaborating with researchers and policymakers to develop better industry standards.
- Advocating for credible benchmarks and open research, like the work done by the ML Energy Initiative.
We take this seriously. Our mission is to ensure that AGI benefits all of humanity—and that includes minimizing our environmental footprint. We’ll continue to engage transparently, learn from independent researchers, and invest in technologies that make AI more sustainable.