AI energy use

AI systems consume enormous and rapidly growing amounts of energy. As of 2025, the power required to train frontier models has been doubling annually, and new data centers are placing significant demands on power grids. The energy efficiency of AI hardware has been improving by roughly 40% each year, but the growth in compute demand has been outpacing it. Epoch tracks the power requirements of frontier AI, how efficiency gains compare to the growth in compute usage, and what the balance between energy demand and supply means for AI development.
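These growth rates can be combined in a rough back-of-the-envelope calculation. The sketch below is illustrative only and assumes, as a simplification, that effective training compute scales as power draw times hardware energy efficiency; the specific starting figure of 100 MW is a hypothetical example, not a figure from the page.

```python
import math

# Back-of-the-envelope: how training power growth and hardware efficiency
# gains combine into effective compute growth.
# Simplifying assumption: effective compute ∝ power draw × energy efficiency.

power_growth = 2.0        # training power doubles annually (2x per year)
efficiency_growth = 1.4   # hardware becomes ~40% more energy-efficient per year

# If power grows 2x and each watt delivers 1.4x more compute,
# effective training compute grows ~2.8x per year.
compute_growth = power_growth * efficiency_growth
print(f"Implied compute growth: {compute_growth:.1f}x per year")

# At 2x/year power growth, a hypothetical 100 MW training run
# would reach 1 GW in log2(1000/100) ≈ 3.3 years.
years_to_1gw = math.log(1000 / 100, power_growth)
print(f"100 MW -> 1 GW in about {years_to_1gw:.1f} years")
```

This also makes the intro's point concrete: a 1.4x/year efficiency gain cannot offset a 2x/year growth in power demand, so total energy use keeps rising.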

OpenAI Stargate: where the US sites stand
Report
Apr. 17, 2026

The $500 billion AI data center initiative is projected to exceed 9 gigawatts of capacity by 2029, with 0.3 gigawatts already operational in Abilene and six more US sites under active construction.

By Elliot Stewart and Ben Cottier

What does the war in Iran mean for AI?
Newsletter
Apr. 10, 2026

A prolonged Hormuz crisis probably won't derail the compute buildout, but it could slow data center expansion and disrupt Gulf investment flows into AI.

By Josh You

Global AI power capacity is now comparable to peak power usage of New York State
Data Insight
Jan. 16, 2026

By Yafah Edelman, Josh You, Venkat Somala, and Luke Emberson

GPUs account for about 40% of power usage in AI data centers
Data Insight
Dec. 18, 2025

By Luke Emberson and Ben Cottier

Is almost everyone wrong about America’s AI power problem?
Newsletter
Dec. 17, 2025

Why power is less of a bottleneck than you think.

By Anson Ho, Yafah Edelman, Josh You, and Jean-Stanislas Denain

Microsoft’s Fairwater datacenter will use more power than Los Angeles
Data Insight
Nov. 26, 2025

By Jaeho Lee

What you need to know about AI data centers
Topic Overview
Nov. 4, 2025

AI companies are planning a buildout of data centers that will rank among the largest infrastructure projects in history. We examine their power demands, what makes AI data centers special, and what all this means for AI policy and the future of AI.

By Ben Cottier and Yafah Edelman

Could decentralized training solve AI’s power problem?
Report
Oct. 28, 2025

We illustrate a decentralized 10 GW training run across a dozen sites spanning thousands of kilometers. Developers are likely to scale individual data centers to multi-gigawatt levels before adopting decentralized training.

By Jaime Sevilla and Anton Troynikov

What will AI look like in 2030?
Report
Sep. 16, 2025

If scaling persists to 2030, AI investments will reach hundreds of billions of dollars and require gigawatts of power. Benchmarks suggest AI could improve productivity in valuable areas such as scientific R&D.

By David Owen

What did it take to train Grok 4?
Data Insight
Sep. 12, 2025

By James Sanders, Luke Emberson, and Yafah Edelman

How much power will frontier AI training demand in 2030?
Paper
Aug. 11, 2025

The power required to train the largest frontier models is growing by more than 2x per year and is on trend to reach multiple gigawatts by 2030.

By Josh You and David Owen

How big could an “AI Manhattan Project” get?
Newsletter
Jul. 2, 2025

An AI Manhattan Project could accelerate compute scaling by two years.

By Arden Berg and Anson Ho

Power requirements of leading AI supercomputers have doubled every 13 months
Data Insight
Jun. 5, 2025

By Konstantin F. Pilz, Robi Rahman, James Sanders, Luke Emberson, and Lennart Heim

How much energy does ChatGPT use?
Newsletter
Feb. 7, 2025

This Gradient Updates issue explores how much energy ChatGPT uses per query, finding that it is about 10x lower than common estimates.

By Josh You

Leading ML hardware becomes 40% more energy-efficient each year
Data Insight
Oct. 23, 2024

By Robi Rahman

The power required to train frontier AI models is doubling annually
Data Insight
Sep. 19, 2024

By Luke Emberson and Robi Rahman

Can AI scaling continue through 2030?
Report
Aug. 20, 2024

We investigate the scalability of AI training runs. We identify electric power, chip manufacturing, data, and latency as constraints. We conclude that 2e29 FLOP training runs will likely be feasible by 2030.

By Jaime Sevilla, Tamay Besiroglu, Ben Cottier, Josh You, Edu Roldán, Pablo Villalobos, and Ege Erdil

Limits to the energy efficiency of CMOS microprocessors
Paper
Dec. 15, 2023

How far can the energy efficiency of CMOS microprocessors be pushed before we hit physical limits? Using a simple model, we find that there is room for a further 50 to 1000x improvement in energy efficiency.

By Anson Ho, Ege Erdil, and Tamay Besiroglu