Machine Learning Trends

Our ML Trends dashboard offers curated key figures, visualizations, and insights that showcase the significant growth and impact of artificial intelligence.

Last updated on Dec 9, 2025

The Epoch Capabilities index: +15.5 ECI per year (Confident)
Compute stock growth: 199.5x/year (Likely)
FLOP/s per dollar: 23.4x/year (Confident)
Training compute: 100000.0x/year (Confident)
Algorithmic progress: 1000.0%/year (Likely)
Largest AI data center: 500,000 H100e (Likely)

Model Performance

LLM inference prices are falling rapidly: 1.0e+40x (Likely)


Benchmarking Hub

Our database of benchmark results, featuring the performance of leading AI models on challenging tasks. It includes results from benchmarks evaluated internally by Epoch AI as well as data collected from external sources. Explore trends across time, by benchmark, or by model.


SOTA vs. consumer hardware models: 8 months (Likely)
LLM context windows: 1.0e+30x (Confident)

AI Companies

Compute stock growth: 199.5x (Likely)
OpenAI revenue growth: 1584.9x (Confident)

AI Companies Hub

Our database of AI companies, covering revenue, funding, staff, and compute for many of the key players in frontier AI.


Frontier inference compute allocation: 30% (Likely)
Frontier AI investment: $97 billion (Likely)

Hardware Trends

FLOP/s per dollar: 23.4x (Confident)
Memory bandwidth: 19.1x (Confident)

Trends in Machine Learning Hardware

Across 47 ML hardware accelerators, FLOP/s performance has doubled every 2.3 years. Switching from FP32 to tensor-FP16 provides a further ~10x performance increase. Memory capacity and bandwidth have each doubled every 4 years.
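Doubling times and annual growth factors are two views of the same exponential trend. A minimal sketch of the standard conversion (the function names are illustrative; the 2.3-year and 4-year doubling times come from the paragraph above):

```python
import math

def doubling_time_to_annual_factor(t_double_years: float) -> float:
    """Annual growth factor implied by a given doubling time."""
    return 2.0 ** (1.0 / t_double_years)

def annual_factor_to_doubling_time(g: float) -> float:
    """Doubling time in years implied by an annual growth factor g > 1."""
    return math.log(2.0) / math.log(g)

# FLOP/s doubling every 2.3 years is ~1.35x per year
print(round(doubling_time_to_annual_factor(2.3), 2))  # 1.35
# Memory capacity/bandwidth doubling every 4 years is ~1.19x per year
print(round(doubling_time_to_annual_factor(4.0), 2))  # 1.19
```

The same conversion works in reverse: an annual factor of 2.0 is by definition a one-year doubling time.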


Adoption of Numerical Formats: 3-4 years (Confident)
FLOP/s per watt: 21.9x (Confident)

Training runs

Training compute: 100000.0x (Confident)
Algorithmic progress: 1000.0% (Likely)
Most compute used in a training run: 5e26 FLOP (Plausible)

Training compute of frontier AI models grows by 4-5x per year

Our expanded AI model database shows that the compute used to train recent models grew 4-5x yearly from 2010 to May 2024. We find similar growth in frontier models, recent large language models, and models from leading companies.
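A 4-5x yearly growth factor compounds quickly. A small sketch of the implied extrapolation (the function and starting value are illustrative; only the 4-5x/year rate comes from the text above):

```python
def extrapolate_compute(c0_flop: float, annual_factor: float, years: float) -> float:
    """Project training compute forward assuming constant exponential growth."""
    return c0_flop * annual_factor ** years

# At the midpoint rate of 4.5x/year, compute grows ~410x over four years.
growth_4y = extrapolate_compute(1.0, 4.5, 4)
print(growth_4y)  # 410.0625
```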


Training cost: 3162.3x (Likely)
Training hardware cost: 398.1x (Likely)

Data Centers

Largest AI data center: 500,000 H100e (Likely)
Build time: 2.1 years (Likely)

Frontier Data Centers

An open database of AI data centers that uses satellite imagery and permit data to estimate compute, power use, and construction timelines.


Cost per gigawatt: $30 billion (Likely)
Country with the most computing power: United States, with 75% of the global total (Likely)
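The cluster-size and cost-per-gigawatt figures above can be combined into a back-of-the-envelope cost estimate. The ~700 W per H100-equivalent and the 1.5x overhead factor for cooling and networking are outside assumptions, not dashboard figures:

```python
H100E_COUNT = 500_000     # largest AI data center (dashboard figure)
WATTS_PER_H100E = 700     # assumed per-chip power draw, not a dashboard figure
OVERHEAD = 1.5            # assumed multiplier for cooling/networking, not a dashboard figure
COST_PER_GW_USD = 30e9    # cost per gigawatt (dashboard figure)

power_gw = H100E_COUNT * WATTS_PER_H100E * OVERHEAD / 1e9
est_cost_usd = power_gw * COST_PER_GW_USD

print(round(power_gw, 3))            # 0.525 GW
print(round(est_cost_usd / 1e9, 2))  # ~15.75 billion USD
```

Under these assumptions, a 500,000-H100e site draws roughly half a gigawatt and would cost on the order of $15 billion to build.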

Acknowledgements

We thank Tom Davidson, Lukas Finnveden, Charlie Giattino, Zach Stein-Perlman, Misha Yagudin, Jai Vipra, Patrick Levermore, Carl Shulman, Ben Bucknall and Daniel Kokotajlo for their feedback.

Several people have contributed to the design and maintenance of this dashboard, including Jaime Sevilla, Pablo Villalobos, Anson Ho, Tamay Besiroglu, Ege Erdil, Ben Cottier, Matthew Barnett, David Owen, Robi Rahman, Lennart Heim, Marius Hobbhahn, David Atkinson, Keith Wynroe, Christopher Phenicie, Nicole Maug, Aleksandar Kostovic, Alex Haase, Robert Sandler, Edu Roldan, Andrew Lucas and Yafah Edelman.

Use this work

Epoch AI’s work is free to use, distribute, and reproduce under the Creative Commons Attribution license, provided the source and authors are credited.

Cite this work as

Epoch AI (2023), "Key Trends and Figures in Machine Learning". Published online at epoch.ai. Retrieved from: 'https://epoch.ai/trends' [online resource]

BibTeX citation

  @misc{epoch2023aitrends,
    title={Key Trends and Figures in Machine Learning},
    author={{Epoch AI}},
    year={2023},
    url={https://epoch.ai/trends},
    note={Accessed: }
  }

If you spot an error or would like to provide feedback, please reach out at .