The pace of large-scale model releases is accelerating

In 2017, only two models exceeded 10²³ FLOP in training compute. By 2020, this had grown to four models; by 2022, to 32; and by 2024, our database contained 174 models known to exceed 10²³ FLOP, plus 99 more whose training compute is unconfirmed but likely above that threshold. As AI investment increases and training hardware becomes more cost-effective, models at this scale come within reach of more and more developers.
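For illustration, a minimal sketch of how such yearly counts could be tallied from a model database is shown below. It assumes a hypothetical CSV export with `publication_year` and `training_compute_flop` columns; the actual schema of the database may differ.

```python
import pandas as pd

# FLOP threshold used in this data insight
THRESHOLD = 1e23

# Hypothetical export of the models database; column names are assumptions.
df = pd.read_csv("models.csv")

# Keep only models with confirmed training compute at or above the threshold.
large = df[df["training_compute_flop"] >= THRESHOLD]

# Number of qualifying models per publication year.
counts_per_year = large.groupby("publication_year").size().sort_index()
print(counts_per_year)
```

Models with unconfirmed training compute would need to be handled separately, for example by flagging rows where the compute estimate is missing but other evidence suggests the model crosses the threshold.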

Published: June 19, 2024

Last updated: February 21, 2025

Epoch’s work is free to use, distribute, and reproduce under the Creative Commons Attribution (CC BY) license, provided the source and authors are credited.