DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
☆171 · Updated 6 months ago (Sep 26, 2025)
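DeepSpeed training runs are typically driven by a JSON configuration file. The sketch below builds a minimal DeepSpeed-style config as a Python dict and writes it to disk; the field names (`train_batch_size`, `fp16.enabled`, `zero_optimization.stage`) come from the public DeepSpeed config schema, but the values are illustrative, not a recommended setup.

```python
import json

# Minimal DeepSpeed-style configuration (illustrative values only).
# Field names follow the public DeepSpeed config schema.
ds_config = {
    "train_batch_size": 16,             # global batch size across all GPUs
    "fp16": {"enabled": True},          # mixed-precision training
    "zero_optimization": {"stage": 2},  # ZeRO stage 2: shard optimizer state and gradients
}

# DeepSpeed accepts either a dict like this directly or a path to a JSON file.
with open("ds_config.json", "w") as f:
    json.dump(ds_config, f, indent=2)
```

In practice this file (or the dict itself) is passed to DeepSpeed at initialization, alongside the model and its parameters.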
Alternatives and similar repositories for DeeperSpeed
Users that are interested in DeeperSpeed are comparing it to the libraries listed below.
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆7,400 · Updated last month (Feb 3, 2026)
- ☆78 · Updated 2 years ago (Dec 7, 2023)
- ☆95 · Updated 3 years ago (Jul 16, 2022)
- Pile Deduplication Code ☆18 · Updated 2 years ago (May 15, 2023)
- RWKV model implementation ☆37 · Updated 2 years ago (Jul 15, 2023)
- ☆24 · Updated last year (Dec 11, 2024)
- Keeping language models honest by directly eliciting knowledge encoded in their activations. ☆217 · Updated last week (Mar 16, 2026)
- Web app for demoing the EAI models ☆16 · Updated 3 years ago (May 18, 2022)
- ☆64 · Updated last year (Apr 9, 2024)
- OSLO: Open Source for Large-scale Optimization ☆175 · Updated 2 years ago (Sep 9, 2023)
- Large Scale Distributed Model Training strategy with Colossal AI and Lightning AI ☆56 · Updated 2 years ago (Sep 1, 2023)
- An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library. ☆21 · Updated 3 years ago (Nov 28, 2022)
- Using queues, tqdm-multiprocess supports multiple worker processes, each with multiple tqdm progress bars, displaying them cleanly throug… ☆43 · Updated 5 years ago (Jan 6, 2021)
- ☆39 · Updated last year (Jul 25, 2024)
- Downloads 2020 English Wikipedia articles as plaintext ☆27 · Updated 3 years ago (Mar 25, 2023)
- Efficiently computing & storing token n-grams from large corpora ☆27 · Updated last year (Oct 6, 2024)
- Lightweight piece tokenization library ☆12 · Updated last year (Apr 15, 2024)
- Yet another random morning idea to be quickly tried and architecture shared if it works; to allow the transformer to pause for any amount… ☆53 · Updated 2 years ago (Oct 22, 2023)
- Vocabulary Parallelism ☆25 · Updated last year (Mar 10, 2025)
- ☆163 · Updated 5 years ago (Mar 5, 2021)
- Model parallel transformers in JAX and Haiku ☆6,365 · Updated 3 years ago (Jan 21, 2023)
- Download, parse, and filter data from Court Listener, part of the FreeLaw projects. Data-ready for The-Pile. ☆15 · Updated 2 years ago (Jun 3, 2023)
- Code for the paper "Mirostat: A Perplexity-Controlled Neural Text Decoding Algorithm" (https://arxiv.org/abs/2007.14966) ☆61 · Updated 4 years ago (Feb 7, 2022)
- Full knowledge and control of the train state. ☆19 · Updated 5 years ago (Sep 23, 2020)
- Python Research Framework ☆107 · Updated 3 years ago (Nov 3, 2022)
- PSTensor provides a way to hack the memory management of tensors in TensorFlow and PyTorch by defining your own C++ Tensor class. ☆10 · Updated 4 years ago (Feb 10, 2022)
- Implementation of Influence Function approximations for differently sized ML models, using PyTorch ☆16 · Updated 2 years ago (Sep 15, 2023)
- Implementation of Token Shift GPT, an autoregressive model that relies solely on shifting the sequence space for mixing ☆49 · Updated 4 years ago (Jan 27, 2022)
- Demonstration that finetuning a RoPE model on longer sequences than it was pre-trained on adapts the model's context limit ☆63 · Updated 2 years ago (Jun 21, 2023)
- ☆2,956 · Updated 2 weeks ago (Mar 9, 2026)
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith, and Mike Lewis ☆147 · Updated 4 years ago (Jul 26, 2021)
- Implementation of RETRO, DeepMind's retrieval-based attention net, in PyTorch ☆879 · Updated 2 years ago (Oct 30, 2023)
- Ongoing research training transformer language models at scale, including BERT & GPT-2 ☆1,438 · Updated 2 years ago (Mar 20, 2024)
- Script for downloading GitHub. ☆13 · Updated 5 years ago (Sep 24, 2020)
- OSLO: Open Source framework for Large-scale model Optimization ☆309 · Updated 3 years ago (Aug 25, 2022)
- OpenAI's DALL-E for large-scale training in mesh-tensorflow ☆431 · Updated 4 years ago (Feb 12, 2022)
- ☆131 · Updated 3 years ago (Jun 9, 2022)
- The hub for EleutherAI's work on interpretability and learning dynamics ☆2,751 · Updated 4 months ago (Nov 15, 2025)
- One-stop shop for all things carp ☆59 · Updated 3 years ago (Sep 9, 2022)