Luke Emberson is a researcher on Epoch AI's data team. He focuses on tracking and forecasting model capabilities through benchmarks and the Epoch Capabilities Index, and coordinates Epoch AI's Data Insight publications.
luke@epoch.ai
Data Insight
Apr. 14, 2026
Five hyperscalers now own over two-thirds of global AI compute
By Luke Emberson, Josh You, and Venkat Somala
Data Insight
Apr. 7, 2026
Google controls the most AI computing power, driven by its custom TPUs
By Luke Emberson, Josh You, and Venkat Somala
Data Insight
Mar. 24, 2026
Total AI chip memory bandwidth has grown 4.1x per year, now reaching 70 million terabytes per second
By Luke Emberson
Data Insight
Feb. 19, 2026
Anthropic could surpass OpenAI in annualized revenue by mid-2026
By Luke Emberson and Yafah Edelman
Data Insight
Feb. 4, 2026
Compute accounts for the majority of expenses of AI companies
By Luke Emberson and Yafah Edelman
Data Insight
Jan. 23, 2026
Benchmark scores are well correlated, even across domains
By Luke Emberson and Yafah Edelman
Data Insight
Jan. 16, 2026
Global AI power capacity is now comparable to peak power usage of New York State
By Yafah Edelman, Josh You, Venkat Somala, and Luke Emberson
Data Insight
Jan. 9, 2026
Global AI computing capacity is doubling every 7 months
By Josh You, Venkat Somala, Yafah Edelman, and Luke Emberson
Data Insight
Jan. 2, 2026
Chinese AI models have lagged the US frontier by 7 months on average since 2023
By Luke Emberson
Data Insight
Dec. 18, 2025
GPUs account for about 40% of power usage in AI data centers
By Luke Emberson and Ben Cottier
Data Insight
Nov. 6, 2025
Epoch’s Capabilities Index stitches together benchmarks across a wide range of difficulties
By Jaeho Lee and Luke Emberson
Data Insight
Oct. 30, 2025
Open-weight models lag state-of-the-art by around 3 months on average
By Luke Emberson
Data Insight
Sep. 30, 2025
AI capabilities have steadily improved over the past year
By Luke Emberson
Data Insight
Sep. 12, 2025
What did it take to train Grok 4?
By James Sanders, Luke Emberson, and Yafah Edelman
Data Insight
Aug. 29, 2025
GPT-5 and GPT-4 were both major leaps in benchmarks from the previous generation
By Luke Emberson and Josh You
Data Insight
Aug. 15, 2025
Frontier AI performance becomes accessible on consumer hardware within a year
By Venkat Somala and Luke Emberson
Data Insight
Aug. 8, 2025
Compute is not a bottleneck for robotic manipulation
By Ben Cottier, Scott Longwell, James Sanders, David Owen, Yafah Edelman, and Luke Emberson
Data Insight
Jul. 25, 2025
Frontier training runs will likely stop getting longer by around 2027
By Luke Emberson and Yafah Edelman
Data Insight
Jun. 5, 2025
Acquisition costs of leading AI supercomputers have doubled every 13 months
By Konstantin F. Pilz, Robi Rahman, James Sanders, Luke Emberson, and Lennart Heim
Data Insight
Jun. 5, 2025
The US hosts the majority of GPU cluster performance, followed by China
By Konstantin F. Pilz, Robi Rahman, James Sanders, Luke Emberson, and Lennart Heim
Data Insight
Jun. 5, 2025
Private-sector companies own a dominant share of GPU clusters
By Konstantin F. Pilz, Robi Rahman, James Sanders, Luke Emberson, and Lennart Heim
Data Insight
Jun. 5, 2025
Power requirements of leading AI supercomputers have doubled every 13 months
By Konstantin F. Pilz, Robi Rahman, James Sanders, Luke Emberson, and Lennart Heim
Data Insight
May 28, 2025
Widespread adoption of new numeric formats took 3-4 years in past cycles
By Venkat Somala and Luke Emberson
Data Insight
Updated Jun. 5, 2025
The computational performance of leading AI supercomputers has doubled every nine months
By Konstantin F. Pilz, Robi Rahman, James Sanders, Luke Emberson, and Lennart Heim
Data Insight
Apr. 17, 2025
LLM responses to benchmark questions are getting longer over time
By Luke Emberson, Ben Cottier, Josh You, Tom Adamczewski, and Jean-Stanislas Denain
Data Insight
Apr. 3, 2025
The combined revenues of leading AI companies grew by over 9x in 2023-2024
By Ben Snodin, David Owen, and Luke Emberson
Data Insight
Mar. 5, 2025
Leading AI chip designs are used for around four years in frontier training
By Luke Emberson, Ben Snodin, and David Owen
Data Insight
Feb. 13, 2025
The stock of computing power from NVIDIA chips is doubling every 10 months
By Luke Emberson and David Owen
Data Insight
Updated Jun. 6, 2025
Over 30 AI models have been trained at the scale of GPT-4
By Robi Rahman, Lovis Heindrich, David Owen, and Luke Emberson
Data Insight
Jan. 15, 2025
Frontier open models may surpass 1e26 FLOP of training compute before 2026
By Luke Emberson
Data Insight
Jan. 8, 2025
Training compute growth is driven by larger clusters, longer training, and better hardware
By Luke Emberson and David Owen
Data Insight
Sep. 19, 2024
The power required to train frontier AI models is doubling annually
By Luke Emberson and Robi Rahman
Data Insight
Aug. 16, 2024
The length of time spent training notable models is growing
By Luke Emberson