AI Supercomputers Documentation

Overview

Our AI Supercomputers Dataset catalogs AI compute clusters built from hardware typically used to train large-scale AI models, recording key details such as performance, hardware type, location, and estimated cost and power draw. This dataset is useful for research on trends in the physical infrastructure used to train artificial intelligence.

This documentation describes which AI supercomputers are contained within the dataset, the information in its records (including data fields and definitions), and processes for adding new entries and auditing accuracy. It also includes a changelog and acknowledgements.

The dataset is accessible on our website as a visualization or table, and is available for download as a CSV file, refreshed daily. For a quick-start example of loading the data and working with it in your research, see this Google Colab demo notebook.
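To get started without the Colab notebook, the minimal sketch below loads the CSV into a pandas DataFrame. The download URL and column names it prints are placeholders rather than confirmed details of the dataset; substitute the CSV link provided on the data page.

import pandas as pd

# Placeholder URL -- replace with the CSV download link from
# https://epoch.ai/data/ai-supercomputers
CSV_URL = "https://epoch.ai/data/ai_supercomputers.csv"

df = pd.read_csv(CSV_URL)

# Inspect the available fields before relying on specific column names.
print(df.shape)
print(df.columns.tolist())
print(df.head())

From there, standard pandas filtering and grouping can be used to study trends such as performance or power draw over time.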

If you have any questions about the dataset, or would like to suggest systems that should be added or edited, feel free to contact us at data@epoch.ai.

If this dataset is useful to you, please cite it.

Use This Work

Epoch’s data is free to use, distribute, and reproduce under the Creative Commons Attribution license, provided the source and authors are credited.

Citation

Konstantin Pilz, Robi Rahman, James Sanders, Lennart Heim, ‘Trends in AI Supercomputers’. Published online at epoch.ai. Retrieved from: ‘https://epoch.ai/data/ai-supercomputers’ [online resource], accessed 2025-08-13

BibTeX citation

@misc{EpochAISupercomputers2025,
  title = {Trends in AI Supercomputers},
  author = {Pilz, Konstantin and Rahman, Robi and Sanders, James and Heim, Lennart},
  year = {2025},
  month = {04},
  url = {https://epoch.ai/data/ai-supercomputers},
  note = {Accessed: 2025-08-13}
}