Almost half of large-scale models have published, downloadable weights
59 large-scale models with known training compute have downloadable weights. Most of these were trained with between 10²³ and 10²⁴ FLOP, less compute than the largest proprietary models. The developers that have released the largest downloadable models to date are Meta and the Technology Innovation Institute.
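As a rough illustration, a count like this could be reproduced from a tabular export of a models database. This is only a sketch: the file name, column names, and the use of 10²³ FLOP as the "large-scale" cutoff are assumptions for the example, not a confirmed schema.

```python
import pandas as pd

# Assumed local CSV export of a models database; the file name and all
# column names below are placeholders, not a confirmed schema.
df = pd.read_csv("models.csv")

compute = df["Training compute (FLOP)"]

# "Large-scale" taken here as >= 1e23 FLOP of training compute (assumption),
# restricted to models whose compute is known and whose weights are downloadable.
large_open = df[
    compute.notna()
    & (compute >= 1e23)
    & df["Model accessibility"].str.contains("open", case=False, na=False)
]

print(f"Large-scale open-weight models with known compute: {len(large_open)}")

# Share of those trained with between 1e23 and 1e24 FLOP.
in_band = large_open["Training compute (FLOP)"].between(1e23, 1e24)
print(f"In the 1e23-1e24 FLOP range: {in_band.sum()} ({in_band.mean():.0%})")
```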