iwiwi / epochraft-hf-fsdp
Example of using Epochraft to train HuggingFace transformers models with PyTorch FSDP
☆12 · Updated 9 months ago
Related projects
Alternatives and complementary repositories for epochraft-hf-fsdp
- Checkpointable dataset utilities for foundation model training ☆32 · Updated 9 months ago
- Supports continual pre-training & instruction tuning; forked from llama-recipes ☆32 · Updated 9 months ago
- Mamba training library developed by Kotoba Technologies ☆68 · Updated 9 months ago
- LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation ☆21 · Updated 7 months ago
- Ongoing research project for continual pre-training of LLMs (dense model) ☆32 · Updated last week
- Flexible evaluation tool for language models ☆36 · Updated this week
- A robust text-processing pipeline framework enabling customizable, efficient, and metric-logged text preprocessing ☆118 · Updated 3 weeks ago
- ☆33 · Updated 3 months ago
- ☆52 · Updated 5 months ago
- Japanese Massive Multitask Language Understanding Benchmark ☆25 · Updated 8 months ago
- ☆24 · Updated 8 months ago
- ☆71 · Updated 6 months ago
- Swallow project: evaluation scripts for large language models ☆10 · Updated 4 months ago
- ☆102 · Updated this week
- ☆21 · Updated last year
- ☆14 · Updated 7 months ago
- ☆13 · Updated 2 months ago
- Japanese LLaMA experiment ☆52 · Updated 8 months ago
- ☆13 · Updated 5 months ago
- Ongoing research on training Mixture-of-Experts models ☆19 · Updated 2 months ago
- ☆44 · Updated last year
- Large-scale RWKV v6 inference with FLA. Capable of inference by combining multiple states (pseudo-MoE). Easy to deploy with Docker. Suppo… ☆16 · Updated last week
- A toolkit for scaling law research ⚖ ☆43 · Updated 8 months ago
- ☆24 · Updated 3 weeks ago
- Simple and efficient PyTorch-native transformer training and inference (batched) ☆61 · Updated 7 months ago
- ☆25 · Updated 5 months ago
- [ICLR 2023] "Sparse MoE as the New Dropout: Scaling Dense and Self-Slimmable Transformers" by Tianlong Chen*, Zhenyu Zhang*, Ajay Jaiswal… ☆44 · Updated last year
- ☆50 · Updated 6 months ago
- Long Context Extension and Generalization in LLMs ☆39 · Updated 2 months ago
- The source code of our work "Prepacking: A Simple Method for Fast Prefilling and Increased Throughput in Large Language Models" ☆56 · Updated last month