okoge-kaz / moe-recipes
Ongoing research on training Mixture-of-Experts models.
☆19 · Updated 2 months ago
Related projects
Alternatives and complementary repositories for moe-recipes
- ☆14 · Updated 2 months ago
- ☆21 · Updated last year
- Japanese LLaMa experiment ☆52 · Updated 8 months ago
- Unofficial entropix implementation for Gemma2, Llama, Qwen2, and Mistral ☆15 · Updated last month
- Swallow project: evaluation scripts for large language models ☆10 · Updated 4 months ago
- ☆22 · Updated 11 months ago
- Mamba training library developed by kotoba technologies ☆68 · Updated 9 months ago
- ☆51 · Updated 5 months ago
- Mixtral-based Ja-En (En-Ja) translation model ☆16 · Updated 10 months ago
- ☆15 · Updated 8 months ago
- LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation ☆21 · Updated 6 months ago
- JMultiWOZ: A Large-Scale Japanese Multi-Domain Task-Oriented Dialogue Dataset ☆22 · Updated 7 months ago
- LLM evaluation project for Japanese tasks ☆77 · Updated this week
- Supports continual pre-training & instruction tuning; forked from llama-recipes ☆32 · Updated 9 months ago
- [2024 edition] Text classification with BERT ☆24 · Updated 4 months ago
- A Slack Bot for summarizing arXiv papers, powered by OpenAI LLMs. ☆68 · Updated last year
- ☆13 · Updated 2 months ago
- ☆82 · Updated last year
- A robust text-processing pipeline framework enabling customizable, efficient, and metric-logged text preprocessing. ☆118 · Updated 3 weeks ago
- ☆48 · Updated 7 months ago
- ☆31 · Updated 3 months ago
- Example of using Epochraft to train HuggingFace transformers models with PyTorch FSDP ☆12 · Updated 9 months ago
- ☆24 · Updated 2 weeks ago
- ☆12 · Updated 5 months ago
- ☆33 · Updated 3 months ago
- Japanese-BPEEncoder ☆39 · Updated 3 years ago
- ☆41 · Updated 9 months ago
- 0️⃣1️⃣🤗 BitNet-Transformers: Huggingface Transformers Implementation of "BitNet: Scaling 1-bit Transformers for Large Language Models" i… ☆95 · Updated 8 months ago
- ☆14 · Updated 7 months ago
- ☆50 · Updated last year