Data Insight
Jan. 15, 2025

Frontier open models may surpass 1e26 FLOP of training compute before 2026

By Luke Emberson

The Biden Administration’s diffusion framework places restrictions on closed-weight models if their training compute surpasses either 1e26 FLOP or the training compute of the largest open model. Historical trends suggest that the largest open model will surpass 1e26 FLOP by November 2025, and grow at close to 5x per year thereafter.
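
As a quick consistency check, the crossing time for a trend growing at a fixed multiple per year follows from a one-line formula. The sketch below back-solves the trend level implied by the post's own numbers: a ~4.7x annual growth rate crossing 1e26 FLOP in November 2025 implies a trend level of roughly 2.6e25 FLOP in January 2025. That trend level is an illustrative assumption, not a figure from the post.

```python
import math

GROWTH = 4.7        # fitted growth of frontier open-model compute, per the post
TARGET = 1e26       # FLOP threshold in the diffusion framework
TREND_NOW = 2.6e25  # assumed trend level in Jan 2025 (back-solved, illustrative)

years_to_cross = math.log(TARGET / TREND_NOW) / math.log(GROWTH)
print(f"{years_to_cross:.2f} years")  # ~0.87 years from Jan 2025, i.e. ~November 2025
```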

There is an additional reason to expect a large open model before 2026: Mark Zuckerberg indicated in October 2024 that Llama 4 models were already being trained on a cluster “larger than 100k H100s”. In the same statement, he strongly implied that these models will continue to be released with open weights. Models trained at this scale are very likely to surpass 1e26 FLOP, and appear to be planned for release in 2025.
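
A back-of-envelope estimate makes the "very likely" concrete. The sketch below uses the H100's dense BF16 peak of roughly 989 TFLOP/s; the 40% utilization and 90-day run length are illustrative assumptions, not figures from Meta or the post.

```python
# Back-of-envelope training compute for a 100k-H100 run.
H100_PEAK_FLOPS = 9.89e14  # dense BF16 peak per H100 SXM, ~989 TFLOP/s
N_GPUS = 100_000           # "larger than 100k H100s"
UTILIZATION = 0.40         # assumed model FLOP utilization (illustrative)
RUN_SECONDS = 90 * 86_400  # assumed ~90-day training run (illustrative)

total_flop = H100_PEAK_FLOPS * N_GPUS * UTILIZATION * RUN_SECONDS
print(f"{total_flop:.1e} FLOP")  # ~3.1e26 FLOP, roughly 3x the 1e26 threshold
```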


We estimate the release date of the first open-weight model trained with over 1e26 FLOP of training compute by extrapolating historical trends. Our methodology suggests that the training compute of the frontier open-weight model grows by around 4.7x per year (90% confidence interval: 3.6 – 6.1), and we project that this trend will surpass 1e26 FLOP in November 2025 (90% confidence interval: August 2025 – November 2026). This analysis does not incorporate specific information about the timing of expected large open models, such as Llama 4.
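
For readers who want to reproduce the flavor of this extrapolation, a minimal log-linear fit is sketched below. The two data points are hypothetical, chosen only so the fitted slope matches the post's ~4.7x per year and crosses 1e26 FLOP in late 2025; they are not Epoch's dataset, which fits many frontier open-weight models.

```python
import numpy as np

def fit_growth_and_crossing(years, flop, target=1e26):
    """Fit log10(compute) against release year; return growth per year and crossing year."""
    slope, intercept = np.polyfit(years, np.log10(flop), 1)
    growth = 10.0 ** slope                            # multiplicative growth per year
    crossing_year = (np.log10(target) - intercept) / slope
    return growth, crossing_year

# Hypothetical points, chosen only to reproduce the post's fitted trend.
years = np.array([2022.5, 2024.5])
flop = np.array([5.4e23, 1.2e25])

growth, crossing = fit_growth_and_crossing(years, flop)
print(f"{growth:.1f}x per year; crosses 1e26 FLOP around {crossing:.2f}")
# -> 4.7x per year; crosses 1e26 FLOP around 2025.87 (late 2025)
```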
