okoge-kaz / moe-recipes
Ongoing research on training Mixture-of-Experts models.
☆18 · Updated this week
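The core idea behind a Mixture-of-Experts layer, the topic of this repository, is sparse routing: a gate scores every expert and only the top-k experts process each input. A minimal pure-Python sketch of top-k routing follows; the function names and the simple linear gate are illustrative assumptions, not the repository's actual API.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, gate_w, experts, k=2):
    """Route input vector x to the top-k experts and mix their outputs.

    gate_w  : one weight vector per expert (a simple linear gate, assumed here)
    experts : list of callables, each mapping a vector to a vector
    """
    # Gate logits: one score per expert (dot product with the gate weights).
    logits = [sum(wi * xi for wi, xi in zip(w, x)) for w in gate_w]
    probs = softmax(logits)
    # Select the k experts with the highest gate probability.
    topk = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:k]
    # Renormalize the gate weights over the selected experts only.
    z = sum(probs[i] for i in topk)
    out = [0.0] * len(x)
    for i in topk:
        y = experts[i](x)
        w = probs[i] / z
        out = [o + w * yi for o, yi in zip(out, y)]
    return out
```

Because only k of the experts run per token, parameter count grows with the number of experts while per-token compute stays roughly constant, which is the main attraction of MoE training.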
Related projects:
- Mixtral-based Ja-En (En-Ja) Translation model ☆15 · Updated 8 months ago
- Japanese LLaMa experiment ☆50 · Updated 6 months ago
- A Slack Bot for summarizing arXiv papers, powered by OpenAI LLMs. ☆66 · Updated last year
- ☆11 · Updated 3 months ago
- ☆21 · Updated 9 months ago
- ☆14 · Updated 2 weeks ago
- LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation ☆21 · Updated 4 months ago
- ☆45 · Updated 3 months ago
- ☆19 · Updated last year
- The robust text processing pipeline framework enabling customizable, efficient, and metric-logged text preprocessing. ☆113 · Updated last month
- ☆40 · Updated 7 months ago
- Mamba training library developed by Kotoba Technologies ☆63 · Updated 7 months ago
- LLM evaluation project for Japanese tasks ☆67 · Updated last week
- Supports continual pre-training & instruction tuning; forked from llama-recipes ☆31 · Updated 7 months ago
- ☆17 · Updated 8 months ago
- ☆32 · Updated last month
- ☆14 · Updated 6 months ago
- Text classification with BERT (2024 edition) ☆22 · Updated 2 months ago
- ☆14 · Updated 5 months ago
- ☆11 · Updated 3 weeks ago
- Checkpointable dataset utilities for foundation model training ☆31 · Updated 7 months ago
- ☆50 · Updated last year
- ☆81 · Updated last year
- Japanese-BPEEncoder ☆39 · Updated 3 years ago
- Code for COLING 2020 paper ☆13 · Updated 2 weeks ago
- ☆24 · Updated 2 years ago
- RealPersonaChat: A Realistic Persona Chat Corpus with Interlocutors' Own Personalities ☆44 · Updated 6 months ago
- ☆47 · Updated 5 months ago
- A library for semantic similarity search ☆23 · Updated 2 weeks ago
- Japanese translation of Open Source AI Definition ☆14 · Updated 3 weeks ago