okoge-kaz / moe-recipes
Ongoing research on training Mixture-of-Experts models.
☆19 · Updated 8 months ago
Alternatives and similar repositories for moe-recipes
Users interested in moe-recipes are comparing it to the libraries listed below.
- Evaluation scripts for the Swallow project's large language models ☆17 · Updated last month
- ☆14 · Updated 9 months ago
- ☆60 · Updated 11 months ago
- ☆42 · Updated last year
- LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation ☆21 · Updated last year
- Easily turn large English text datasets into Japanese text datasets using open LLMs. ☆20 · Updated 4 months ago
- ☆22 · Updated last year
- Japanese LLaMa experiment ☆53 · Updated 6 months ago
- Mixtral-based Ja-En (En-Ja) Translation model ☆19 · Updated 5 months ago
- Japanese Massive Multitask Language Understanding Benchmark ☆36 · Updated 5 months ago
- Project for evaluating LLMs on Japanese tasks ☆83 · Updated last week
- Supports continual pre-training & instruction tuning; forked from llama-recipes ☆32 · Updated last year
- Official implementation of "TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models" ☆106 · Updated 4 months ago
- ☆40 · Updated 3 months ago
- Mamba training library developed by Kotoba Technologies ☆70 · Updated last year
- ☆47 · Updated 5 months ago
- ☆33 · Updated 10 months ago
- ☆24 · Updated last year
- Example of using Epochraft to train HuggingFace transformers models with PyTorch FSDP ☆11 · Updated last year
- Unofficial entropix implementation for Gemma2, Llama, Qwen2, and Mistral ☆17 · Updated 4 months ago
- ☆26 · Updated 7 months ago
- ☆16 · Updated last year
- Ongoing research project for continual pre-training of LLMs (dense models) ☆42 · Updated 3 months ago
- A robust text-processing pipeline framework enabling customizable, efficient, and metric-logged text preprocessing ☆122 · Updated 2 weeks ago
- ☆14 · Updated last year
- Preferred Generation Benchmark ☆82 · Updated 2 weeks ago
- ☆83 · Updated last year
- ☆13 · Updated 8 months ago
- ☆16 · Updated 6 months ago
- ☆49 · Updated last year