Data Insight
Sep. 12, 2025

What did it take to train Grok 4?

By James Sanders, Luke Emberson, and Yafah Edelman

Training today’s leading AI models involves a lot of energy, emissions, water, and money. Consider Grok 4: the training compute alone cost about half a billion U.S. dollars, and required enough energy to power a town of 4,000 Americans. This came with a large environmental footprint, emitting as much CO2 as a Boeing aircraft does over three years, and requiring over 750 million liters of water for cooling. That’s enough to fill 300 Olympic-sized swimming pools.

These numbers do not even account for the costs of human labor, or the compute costs of running experiments or serving Grok 4 to users, both of which can be very significant. Needless to say, current frontier AI is a very expensive and resource-intensive endeavor.

Epoch's work is free to use, distribute, and reproduce provided the source and authors are credited under the Creative Commons BY license.


Training today’s frontier models requires substantial inputs of electricity, water, and money, and produces significant emissions. We estimate the resources needed to train Grok 4 and contextualize them against familiar benchmarks. We estimate that training Grok 4 required 310 GWh of electricity, cost $490 million, used about 750 million liters of water, and emitted the equivalent of 150,000 tons of CO2.
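As a rough sanity check on the swimming-pool comparison, here is a minimal sketch. It assumes an Olympic-size pool holds about 2.5 million liters (a 50 m × 25 m pool at 2 m depth); that figure is our assumption, not from the source.

```python
# Sanity check: how many Olympic pools does the estimated cooling water fill?
# Assumption (not from the source): an Olympic pool is 50 m x 25 m x 2 m deep,
# i.e. 2,500 cubic meters, or 2.5 million liters.

WATER_LITERS = 750e6          # estimated cooling water used to train Grok 4
OLYMPIC_POOL_LITERS = 2.5e6   # 50 * 25 * 2 m^3, converted to liters

pools = WATER_LITERS / OLYMPIC_POOL_LITERS
print(f"{pools:.0f} Olympic-sized swimming pools")  # prints "300 Olympic-sized swimming pools"
```

The 300-pool figure in the text is consistent with the 750-million-liter estimate under this pool-volume assumption.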

Analysis

Assumptions