Pythia: A Suite for Analyzing Large Language Models Across Training and Scaling

Open Access
Authors
  • S. Biderman
  • H. Schoelkopf
  • Q. Anthony
  • H. Bradley
  • K. O'Brien
  • E. Hallahan
  • M.A. Khan
  • S. Purohit
  • U.S.V.S.N. Sai Prashanth
  • E. Raff
  • A. Skowron
  • L. Sutawika
  • O. van der Wal
Publication date 2023
Journal Proceedings of Machine Learning Research
Event 40th International Conference on Machine Learning, ICML 2023
Volume 202
Pages (from-to) 2397-2430
Number of pages 34
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract

How do large language models (LLMs) develop and evolve over the course of training? How do these patterns change as models scale? To answer these questions, we introduce Pythia, a suite of 16 LLMs all trained on public data seen in the exact same order and ranging in size from 70M to 12B parameters. We provide public access to 154 checkpoints for each one of the 16 models, alongside tools to download and reconstruct their exact training dataloaders for further study. We intend Pythia to facilitate research in many areas, and we present several case studies including novel results in memorization, term frequency effects on few-shot performance, and reducing gender bias. We demonstrate that this highly controlled setup can be used to yield novel insights toward LLMs and their training dynamics. Trained models, analysis code, training code, and training data can be found at https://github.com/EleutherAI/pythia.
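
The checkpoints described above are published as revision branches on the Hugging Face Hub, and the linked repository documents loading them through the transformers library. A minimal sketch, assuming the EleutherAI/pythia-* model IDs and the "stepN" branch naming used there:

  # Load one of the 154 intermediate Pythia checkpoints via Hugging Face
  # transformers. The model ID and "stepN" revision naming follow the
  # conventions documented in the Pythia repository; check the Hub for
  # the exact branches available for each model.
  from transformers import AutoTokenizer, GPTNeoXForCausalLM

  model = GPTNeoXForCausalLM.from_pretrained(
      "EleutherAI/pythia-70m-deduped",  # smallest model in the suite
      revision="step3000",              # an intermediate training step
  )
  tokenizer = AutoTokenizer.from_pretrained(
      "EleutherAI/pythia-70m-deduped",
      revision="step3000",
  )

  inputs = tokenizer("Hello, I am", return_tensors="pt")
  outputs = model.generate(**inputs, max_new_tokens=20)
  print(tokenizer.decode(outputs[0]))

Loading two revisions of the same model ID in this way is what enables the across-training comparisons the paper studies.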

Document type Article
Note Proceedings of the 40th International Conference on Machine Learning, 23-29 July 2023, Honolulu, Hawaii, USA
Language English
Published at https://proceedings.mlr.press/v202/biderman23a.html
Other links
  • https://github.com/EleutherAI/pythia
  • https://www.scopus.com/pages/publications/85174386797