Central place for the BigScience engineering/scaling working group (WG): documentation, SLURM scripts and logs, compute environment, and data.
☆1,008 · updated Jul 29, 2024
Alternatives and similar repositories for bigscience
Users interested in bigscience are comparing it to the libraries listed below.
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 (☆1,435, updated Mar 20, 2024)
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 (☆2,229, updated Aug 14, 2025)
- Repo for external large-scale work (☆6,543, updated Apr 27, 2024)
- Ongoing research training transformer models at scale (☆15,242, updated this week)
- Toolkit for creating, sharing and using natural language prompts (☆2,996, updated Oct 23, 2023); a usage sketch appears after this list
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries (☆7,392, updated Feb 3, 2026)
- Code used for sourcing and cleaning the BigScience ROOTS corpus (☆318, updated Mar 20, 2023)
- A repo for distributed training of language models with Reinforcement Learning via Human Feedback (RLHF) (☆4,741, updated Jan 8, 2024)
- Reproduce results and replicate training of T0 (Multitask Prompted Training Enables Zero-Shot Task Generalization) (☆466, updated Nov 5, 2022)
- Crosslingual Generalization through Multitask Finetuning (☆537, updated Sep 22, 2024)
- PyTorch extensions for high performance and large scale training (☆3,400, updated Apr 26, 2025)
- Transformer related optimization, including BERT, GPT (☆6,394, updated Mar 27, 2024)
- Accessible large language models via k-bit quantization for PyTorch (☆7,997, updated this week); a loading sketch appears after this list
- The hub for EleutherAI's work on interpretability and learning dynamics (☆2,740, updated Nov 15, 2025)
- Beyond the Imitation Game collaborative benchmark for measuring and extrapolating the capabilities of language models (☆3,207, updated Jul 19, 2024)
- Example models using DeepSpeed (☆6,785, updated Feb 7, 2026)
- Fast Inference Solutions for BLOOM (☆566, updated Oct 9, 2024)
- Human preference data for "Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback" (☆1,816, updated Jun 17, 2025)
- Training and serving large-scale neural networks with auto parallelization (☆3,183, updated Dec 9, 2023)
- GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023) (☆7,672, updated Jul 25, 2023)
- The RedPajama-Data repository contains code for preparing large datasets for training large language models (☆4,923, updated Dec 7, 2024)
- Foundation Architecture for (M)LLMs (☆3,135, updated Apr 11, 2024)
- Maximal update parametrization (µP) (☆1,686, updated Jul 17, 2024); a width-transfer sketch appears after this list
- Fast and memory-efficient exact attention (☆22,361, updated this week); a kernel-call sketch appears after this list
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, with automatic mixed precision (☆9,513, updated this week); a training-loop sketch appears after this list
- Train transformer language models with reinforcement learning (☆17,460, updated this week)
- A framework for few-shot evaluation of language models (☆11,478, updated Feb 15, 2026); an evaluation sketch appears after this list
- Hackable and optimized Transformers building blocks, supporting a composable construction (☆10,353, updated Feb 20, 2026)
- Minimalistic large language model 3D-parallelism training (☆2,569, updated Feb 19, 2026)
- OSLO: Open Source framework for Large-scale model Optimization (☆309, updated Aug 25, 2022)
- MII makes low-latency and high-throughput inference possible, powered by DeepSpeed (☆2,095, updated Jun 30, 2025)
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (☆22,033, updated Jan 23, 2026)
- Expanding natural instructions (☆1,035, updated Dec 11, 2023)
- Scaling Data-Constrained Language Models (☆340, updated Jun 28, 2025)
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective (☆41,648, updated this week); an initialization sketch appears after this list
- A Unified Library for Parameter-Efficient and Modular Transfer Learning (☆2,801, updated Oct 12, 2025)
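Usage sketches for a few of the libraries above follow. First, for the prompt toolkit (promptsource), a minimal sketch of loading and applying a template, assuming the package is installed and that the chosen ag_news template renders both an input and a target; the dataset and field names are illustrative choices:

```python
# Hedged sketch of the promptsource API; "ag_news", "text", and
# "label" are example names, not the only options.
from promptsource.templates import DatasetTemplates

templates = DatasetTemplates("ag_news")      # all prompts for ag_news
name = templates.all_template_names[0]       # pick the first template
template = templates[name]

example = {"text": "Stocks rallied after the earnings report.", "label": 0}
prompt, target = template.apply(example)     # render (input, target) strings
print(prompt, "->", target)
```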
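For bitsandbytes, k-bit loading is usually driven through Hugging Face transformers rather than called directly; a minimal sketch, assuming a CUDA GPU and using `bigscience/bloom-560m` purely as an example checkpoint:

```python
# Hedged sketch: 4-bit NF4 loading via transformers' BitsAndBytesConfig,
# which delegates the quantization to bitsandbytes.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # store weights in 4-bit NF4
    bnb_4bit_compute_dtype=torch.bfloat16,  # run matmuls in bf16
)
model = AutoModelForCausalLM.from_pretrained(
    "bigscience/bloom-560m",                # example model name
    quantization_config=bnb_config,
    device_map="auto",
)
```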
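For µP (microsoft/mup), hyperparameters tuned on a narrow proxy are meant to transfer to a wider model once base shapes are recorded; a hedged sketch, with the layer sizes purely illustrative:

```python
# Hedged sketch of mup width transfer: record base shapes from a
# narrow proxy so the learning rate carries over to the wide model.
import torch.nn as nn
from mup import MuAdam, MuReadout, set_base_shapes

def make_model(width):
    return nn.Sequential(
        nn.Linear(256, width),
        nn.ReLU(),
        MuReadout(width, 10),               # mup-aware output layer
    )

base = make_model(width=64)                 # narrow proxy at base width
model = make_model(width=1024)              # the model actually trained
set_base_shapes(model, base)                # infer per-param width multipliers
opt = MuAdam(model.parameters(), lr=1e-3)   # lr chosen on the proxy
```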
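For flash-attention, the 2.x releases expose the core kernel as a single function; a sketch assuming a CUDA GPU and fp16 inputs shaped (batch, seqlen, heads, headdim):

```python
# Hedged sketch of the FlashAttention 2.x kernel API; requires CUDA
# and fp16/bf16 tensors.
import torch
from flash_attn import flash_attn_func

q = torch.randn(2, 1024, 16, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 1024, 16, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 1024, 16, 64, device="cuda", dtype=torch.float16)

out = flash_attn_func(q, k, v, causal=True)  # exact attention, O(seqlen) memory
print(out.shape)                             # (2, 1024, 16, 64)
```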
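For accelerate, the same training loop runs unchanged on one GPU or many; a self-contained toy sketch, where the linear model and random data are placeholders:

```python
# Hedged sketch of an accelerate training loop on toy data.
import torch
from accelerate import Accelerator

model = torch.nn.Linear(32, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
dataset = torch.utils.data.TensorDataset(torch.randn(256, 32), torch.randn(256, 1))
loader = torch.utils.data.DataLoader(dataset, batch_size=16)

accelerator = Accelerator()  # handles device placement, DDP, mixed precision
model, optimizer, loader = accelerator.prepare(model, optimizer, loader)

loss_fn = torch.nn.MSELoss()
for x, y in loader:
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    accelerator.backward(loss)  # replaces loss.backward()
    optimizer.step()
```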
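For lm-evaluation-harness, recent (0.4+) releases expose a Python entry point; a hedged sketch, with the model and task names as examples only:

```python
# Hedged sketch of the lm-evaluation-harness Python API (v0.4+);
# earlier releases used a main.py CLI instead.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                                     # Hugging Face backend
    model_args="pretrained=EleutherAI/pythia-160m", # example checkpoint
    tasks=["lambada_openai"],                       # example task
    batch_size=8,
)
print(results["results"])
```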
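Finally, for DeepSpeed, wrapping a model in a ZeRO engine is a one-call change; a minimal sketch, noting that real runs are normally launched via the `deepspeed` CLI and that the config values here are illustrative:

```python
# Hedged sketch of deepspeed.initialize with a ZeRO stage-2 config.
import torch
import deepspeed

model = torch.nn.Linear(1024, 1024)     # stand-in for a transformer
ds_config = {
    "train_batch_size": 8,
    "fp16": {"enabled": True},
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    "zero_optimization": {"stage": 2},  # shard optimizer state + gradients
}
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```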