Central place for the engineering/scaling WG: documentation, SLURM scripts and logs, compute environment and data.
☆1,010 · Updated Jul 29, 2024
Alternatives and similar repositories for bigscience
Users interested in bigscience are comparing it to the repositories listed below.
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 (☆1,437, updated Mar 20, 2024)
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 (☆2,233, updated Aug 14, 2025)
- Ongoing research training transformer models at scale (☆15,647, updated this week)
- Toolkit for creating, sharing and using natural language prompts. (☆3,006, updated Oct 23, 2023)
- Repo for external large-scale work (☆6,542, updated Apr 27, 2024)
- Crosslingual Generalization through Multitask Finetuning (☆537, updated Sep 22, 2024)
- Code used for sourcing and cleaning the BigScience ROOTS corpus (☆319, updated Mar 20, 2023)
- A repo for distributed training of language models with Reinforcement Learning via Human Feedback (RLHF) (☆4,739, updated Jan 8, 2024)
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries (☆7,399, updated Feb 3, 2026)
- PyTorch extensions for high performance and large scale training. (☆3,403, updated Apr 26, 2025)
- Reproduce results and replicate training of T0 (Multitask Prompted Training Enables Zero-Shot Task Generalization) (☆465, updated Nov 5, 2022)
- Fast Inference Solutions for BLOOM (☆566, updated Oct 9, 2024)
- Code and Data for Evaluation WG (☆42, updated May 4, 2022)
- Example models using DeepSpeed (☆6,805, updated Mar 4, 2026)
- The RedPajama-Data repository contains code for preparing large datasets for training large language models. (☆4,929, updated Dec 7, 2024)
- Transformer related optimization, including BERT, GPT (☆6,397, updated Mar 27, 2024)
- Accessible large language models via k-bit quantization for PyTorch. (☆8,052, updated this week)
- Maximal update parametrization (µP) (☆1,690, updated Jul 17, 2024)
- GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023) (☆7,664, updated Jul 25, 2023)
- The hub for EleutherAI's work on interpretability and learning dynamics (☆2,745, updated Nov 15, 2025)
- Beyond the Imitation Game collaborative benchmark for measuring and extrapolating the capabilities of language models (☆3,215, updated Jul 19, 2024)
- Training and serving large-scale neural networks with auto parallelization. (☆3,187, updated Dec 9, 2023)
- Human preference data for "Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback" (☆1,827, updated Jun 17, 2025)
- Foundation Architecture for (M)LLMs (☆3,135, updated Apr 11, 2024)
- Train transformer language models with reinforcement learning. (☆17,697, updated this week)
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… (☆9,563, updated this week)
- Fast and memory-efficient exact attention (☆22,832, updated this week)
- Hackable and optimized Transformers building blocks, supporting a composable construction. (☆10,373, updated this week)
- A framework for few-shot evaluation of language models. (☆11,704, updated Mar 5, 2026)
- Minimalistic large language model 3D-parallelism training (☆2,609, updated Feb 19, 2026)
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. (☆41,807, updated this week)
- MII makes low-latency and high-throughput inference possible, powered by DeepSpeed. (☆2,101, updated Jun 30, 2025)
- Scaling Data-Constrained Language Models (☆342, updated Jun 28, 2025)
- Expanding natural instructions (☆1,036, updated Dec 11, 2023)
- Tools for managing datasets for governance and training. (☆90, updated this week)
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. (☆32,191, updated Sep 30, 2025)