deepspeedai / deepspeed-gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
☆20 · Updated 2 years ago
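Since the repository builds directly on DeepSpeed, a minimal sketch of the underlying pattern may help orient readers: wrapping a PyTorch model with `deepspeed.initialize`. The toy model and config values below are illustrative assumptions, not code from deepspeed-gpt-neox itself.

```python
# Minimal sketch of the core DeepSpeed pattern that deepspeed-gpt-neox builds on.
# The toy model and config values are illustrative, not taken from the repository.
import torch
import deepspeed

model = torch.nn.TransformerEncoderLayer(d_model=512, nhead=8)

ds_config = {
    "train_batch_size": 8,               # global batch size across all GPUs
    "fp16": {"enabled": True},           # mixed-precision training
    "zero_optimization": {"stage": 2},   # ZeRO-2 sharding of optimizer state + gradients
}

# deepspeed.initialize returns (engine, optimizer, dataloader, lr_scheduler);
# the engine then handles backward(), gradient accumulation, and optimizer steps.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```

A script like this is normally started with the `deepspeed` CLI launcher (e.g. `deepspeed train.py`), which sets up the multi-GPU environment.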
Alternatives and similar repositories for deepspeed-gpt-neox
Users interested in deepspeed-gpt-neox are comparing it to the libraries listed below.
- Megatron LM 11B on Huggingface Transformers ☆27 · Updated 4 years ago
- Implementation of a stop sequencer for Huggingface Transformers ☆16 · Updated 2 years ago
- A minimal PyTorch Lightning OpenAI GPT with DeepSpeed training! ☆113 · Updated 2 years ago
- ↔️ T5 machine translation from English to Korean ☆18 · Updated 3 years ago
- ☆19 · Updated 3 years ago
- A PyTorch implementation of Luna: Linear Unified Nested Attention ☆41 · Updated 4 years ago
- Train 🤗 Transformers with DeepSpeed: ZeRO-2, ZeRO-3 (see the sketch after this list) ☆23 · Updated 4 years ago
- MeCab model trained with OpenKorPos ☆23 · Updated 3 years ago
- BERT and RoBERTa fine-tuning on the SQuAD dataset using pytorch-lightning⚡️, 🤗-transformers & 🤗-nlp ☆36 · Updated 2 years ago
- ☆20 · Updated 2 years ago
- ☆23 · Updated 2 years ago
- Large-scale distributed model training strategy with Colossal AI and Lightning AI ☆56 · Updated 2 years ago
- baikal.ai's pre-trained BERT models: descriptions and sample code ☆12 · Updated 4 years ago
- A collection of public Korean instruction datasets for training language models ☆19 · Updated 2 years ago
- A project for Korean automatic word spacing ☆12 · Updated 5 years ago
- Data-related codebase for the Polyglot project ☆18 · Updated 2 years ago
- Beyond LM: How can language models go forward in the future? ☆15 · Updated 2 years ago
- ☆24 · Updated 2 years ago
- ☆30 · Updated 2 years ago
- Weekly meetup, Thursdays at 20:00 ☆16 · Updated 5 years ago
- Deploy KoGPT with Triton Inference Server ☆14 · Updated 2 years ago
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX ☆81 · Updated 3 years ago
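For the "Train 🤗 Transformers with DeepSpeed" entry above, here is a hedged sketch of the standard 🤗 Trainer + DeepSpeed integration. The model, toy dataset, and hyperparameters are placeholders, not taken from that repository.

```python
# Hedged sketch of training a 🤗 Transformers model with DeepSpeed ZeRO via the
# Trainer API. Model, dataset, and hyperparameters are illustrative placeholders.
import torch
from torch.utils.data import Dataset
from transformers import AutoModelForCausalLM, Trainer, TrainingArguments

class ToyDataset(Dataset):
    """A few fixed token sequences, just enough to make the sketch runnable."""
    def __len__(self):
        return 8
    def __getitem__(self, idx):
        ids = torch.full((16,), 50256, dtype=torch.long)  # GPT-2 EOS token id
        return {"input_ids": ids, "labels": ids}

ds_config = {
    "train_micro_batch_size_per_gpu": "auto",  # let the Trainer fill this in
    "zero_optimization": {"stage": 2},         # ZeRO-2: shard optimizer state + gradients
    "fp16": {"enabled": True},
}

model = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder model

args = TrainingArguments(
    output_dir="out",
    per_device_train_batch_size=4,
    deepspeed=ds_config,  # Trainer calls deepspeed.initialize internally
)

Trainer(model=model, args=args, train_dataset=ToyDataset()).train()
```

As with any DeepSpeed run, this would normally be launched with the `deepspeed` launcher (e.g. `deepspeed train.py`) on one or more GPUs.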