deepspeedai / deepspeed-gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
☆23 · Updated 2 years ago
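Since the repository builds on the DeepSpeed library, the sketch below shows what a minimal DeepSpeed training step looks like. It is illustrative only, assuming a generic PyTorch model: the `TinyGPT` module and the config values are placeholders, not the actual deepspeed-gpt-neox training setup.

```python
# Minimal DeepSpeed training sketch (assumed, generic setup -- not the repo's own script).
import torch
import torch.nn as nn
import deepspeed

class TinyGPT(nn.Module):
    """Placeholder autoregressive model standing in for the real GPT-NeoX stack."""
    def __init__(self, vocab_size=1000, dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, vocab_size)

    def forward(self, tokens):
        return self.proj(self.embed(tokens))

ds_config = {
    "train_batch_size": 8,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},  # ZeRO-2: shard optimizer states and gradients
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

model = TinyGPT()
# deepspeed.initialize wraps the model in an engine that handles ZeRO, fp16, etc.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

tokens = torch.randint(0, 1000, (8, 32)).to(engine.device)
logits = engine(tokens)
loss = nn.functional.cross_entropy(logits.view(-1, 1000), tokens.view(-1))
engine.backward(loss)  # engine handles loss scaling and gradient partitioning
engine.step()
```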
Alternatives and similar repositories for deepspeed-gpt-neox
Users interested in deepspeed-gpt-neox are comparing it to the libraries listed below.
- Megatron LM 11B on Huggingface Transformers ☆27 · Updated 4 years ago
- Implementation of a stop sequencer for Huggingface Transformers ☆16 · Updated 2 years ago
- Calculating expected time for training LLMs ☆38 · Updated 2 years ago
- Anh - LAION's multilingual assistant datasets and models ☆27 · Updated 2 years ago
- ☆19 · Updated 2 years ago
- Data-related codebase for the Polyglot project ☆19 · Updated 2 years ago
- A minimal PyTorch Lightning OpenAI GPT with DeepSpeed training! ☆113 · Updated 2 years ago
- A PyTorch implementation of Luna: Linear Unified Nested Attention ☆41 · Updated 4 years ago
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX ☆81 · Updated 3 years ago
- Large-scale distributed model training strategy with Colossal AI and Lightning AI ☆56 · Updated 2 years ago
- MeCab model trained with OpenKorPos ☆23 · Updated 3 years ago
- Train 🤗 Transformers with DeepSpeed: ZeRO-2, ZeRO-3 ☆23 · Updated 4 years ago
- Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism ☆32 · Updated 3 years ago
- Convenient text-to-text training for Transformers ☆19 · Updated 3 years ago
- ☆23 · Updated last year
- ☆30 · Updated 2 years ago
- **ARCHIVED** Filesystem interface to the 🤗 Hub ☆58 · Updated 2 years ago
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ☆58 · Updated 3 years ago
- Implementation of Google Meena for open-domain conversation ☆29 · Updated 3 years ago
- ↔️ T5 machine translation from English to Korean ☆18 · Updated 3 years ago
- A collection of public Korean instruction datasets for training language models ☆19 · Updated 2 years ago
- ☆11 · Updated 5 years ago
- Repo for the "Smart Word Suggestions" (SWS) task and benchmark ☆20 · Updated last year
- ☆20 · Updated 2 years ago
- Techniques used to run BLOOM inference in parallel ☆37 · Updated 2 years ago
- Beyond LM: How can language models go forward in the future? ☆15 · Updated 2 years ago
- ☆87 · Updated 3 years ago
- GPT-jax based on the official Huggingface library ☆13 · Updated 4 years ago
- 🚀 Implementation of easy-to-use 3D parallelism based on Huggingface Transformers & Microsoft DeepSpeed ☆31 · Updated 3 years ago
- Machine-generated captions for Best Artworks ☆22 · Updated 2 years ago