wandb / llm-leaderboard
A project for evaluating LLMs on Japanese tasks
☆67 · Updated last week
Related projects:
- LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation ☆21 · Updated 4 months ago
- Japanese LLaMa experiment ☆50 · Updated 6 months ago
- Japanese chat dataset for building LLMs ☆76 · Updated 7 months ago
- A robust text-processing pipeline framework enabling customizable, efficient, and metric-logged text preprocessing ☆113 · Updated last month
- A framework for few-shot evaluation of autoregressive language models ☆144 · Updated last week
- Japanese instruction data ☆22 · Updated last year
- Supports continual pre-training and instruction tuning; forked from llama-recipes ☆31 · Updated 7 months ago
- JaQuAD: Japanese Question Answering Dataset for Machine Reading Comprehension (2022, Skelter Labs) ☆106 · Updated 2 years ago
- 🤖 A collection of AI-agent resources, including research papers, blogs, and products focused on developing autonomous systems ☆35 · Updated 3 months ago
- Code to pre-train Japanese T5 models ☆40 · Updated 3 years ago
- The evaluation scripts of JMTEB (Japanese Massive Text Embedding Benchmark) ☆25 · Updated this week
- Japanese Massive Multitask Language Understanding Benchmark ☆25 · Updated 6 months ago
- Mixtral-based Ja-En (En-Ja) translation model ☆15 · Updated 8 months ago
- A flexible evaluation tool for language models ☆27 · Updated this week
- Ongoing research project for continual pre-training of LLMs (dense model) ☆24 · Updated this week
- COMET-ATOMIC ja ☆28 · Updated 6 months ago
- Exploring Japanese SimCSE ☆60 · Updated 10 months ago
- JQaRA: Japanese Question Answering with Retrieval Augmentation, a Japanese Q&A dataset for evaluating retrieval-augmented generation (RAG) ☆16 · Updated last week
- Utility scripts for preprocessing Wikipedia texts for NLP ☆73 · Updated 5 months ago