AI scaling

The story of AI progress is dominated by scale. Training AI systems with more compute, power, and data has consistently led to better performance. Epoch tracks the scale-up of the resources used to train AI systems and what this means for capabilities and the future of AI. Its research covers training compute trends, data availability, scaling laws, hardware constraints, and the question of whether scaling can continue through the end of the decade.
