Oneflow-Inc / libai
LiBai(李白): A Toolbox for Large-Scale Distributed Parallel Training
☆408 · Updated 3 weeks ago
Alternatives and similar repositories for libai
Users interested in libai are comparing it to the libraries listed below.
- ☆220 · Updated 2 years ago
- Models and examples built with OneFlow ☆98 · Updated 10 months ago
- ☆79 · Updated last year
- Best practice for training LLaMA models in Megatron-LM ☆660 · Updated last year
- Optimized BERT transformer inference on NVIDIA GPUs. https://arxiv.org/abs/2210.03052 ☆475 · Updated last year
- Efficient Training (including pre-training and fine-tuning) for Big Models ☆604 · Updated 2 months ago
- Easy Parallel Library (EPL) is a general and efficient deep learning framework for distributed model training. ☆267 · Updated 2 years ago
- Running BERT without Padding ☆475 · Updated 3 years ago
- OneFlow models for benchmarking. ☆104 · Updated last year
- Transformer-related optimization, including BERT and GPT ☆39 · Updated 2 years ago
- FlagScale is a large-model toolkit based on open-source projects. ☆346 · Updated last week
- Deep Learning Framework Performance Profiling Toolkit ☆287 · Updated 3 years ago
- Transformer-related optimization, including BERT and GPT ☆59 · Updated last year
- OneFlow documentation ☆69 · Updated last year
- ☆128 · Updated 8 months ago
- Simple Dynamic Batching Inference ☆145 · Updated 3 years ago
- Efficient Inference for Big Models ☆586 · Updated 2 years ago
- Model Compression for Big Models ☆164 · Updated 2 years ago
- Collaborative Training of Large Language Models in an Efficient Way ☆416 · Updated 11 months ago
- ☆614 · Updated last year
- ☆172 · Updated this week
- USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long-Context Transformer Model Training and Inference ☆549 · Updated last month
- ☆54 · Updated last week
- The road to hack SysML and become a systems expert ☆498 · Updated 11 months ago
- A flexible and efficient training framework for large-scale alignment tasks ☆415 · Updated this week
- Easy and Efficient Transformer: Scalable Inference Solution for Large NLP Models ☆263 · Updated 8 months ago
- InternEvo is an open-source, lightweight training framework that aims to support model pre-training without extensive dependencies ☆404 · Updated this week
- Examples of training models with hybrid parallelism using ColossalAI ☆340 · Updated 2 years ago
- Large-scale model inference. ☆632 · Updated last year
- Ascend PyTorch adapter (torch_npu). Mirror of https://gitee.com/ascend/pytorch ☆410 · Updated this week