swallow-llm / swallow-evaluation
Swallow Project: evaluation scripts for large language models
☆22 · Updated 2 months ago
Alternatives and similar repositories for swallow-evaluation
Users interested in swallow-evaluation are comparing it to the repositories listed below.
- ☆140 · Updated last week
- ☆62 · Updated last year
- Japanese Massive Multitask Language Understanding Benchmark ☆38 · Updated 2 months ago
- ☆24 · Updated last year
- The robust text processing pipeline framework enabling customizable, efficient, and metric-logged text preprocessing. ☆124 · Updated 3 weeks ago
- ☆27 · Updated last year
- ☆33 · Updated last year
- Japanese LLaMa experiment ☆54 · Updated last month
- Swallow Project: evaluation framework for post-trained large language models ☆23 · Updated last month
- Flexible evaluation tool for language models ☆54 · Updated last week
- A framework for few-shot evaluation of autoregressive language models. ☆154 · Updated last year
- Preferred Generation Benchmark ☆85 · Updated last month
- JMultiWOZ: A Large-Scale Japanese Multi-Domain Task-Oriented Dialogue Dataset, LREC-COLING 2024 ☆25 · Updated last year
- RealPersonaChat: A Realistic Persona Chat Corpus with Interlocutors' Own Personalities ☆63 · Updated last year
- ☆55 · Updated last year
- Japanese chat dataset for building LLMs ☆86 · Updated last year
- LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation ☆22 · Updated last year
- ☆88 · Updated 2 years ago
- Exploring Japanese SimCSE ☆69 · Updated 2 years ago
- JMED-LLM: Japanese Medical Evaluation Dataset for Large Language Models ☆53 · Updated last year
- ☆29 · Updated 7 months ago
- ☆16 · Updated last year
- ☆38 · Updated 7 months ago
- Ongoing research project for continual pre-training of LLMs (dense model) ☆44 · Updated 9 months ago
- Mixtral-based Ja-En (En-Ja) Translation model ☆20 · Updated 11 months ago
- ☆43 · Updated last year
- ☆50 · Updated last year
- The evaluation scripts of JMTEB (Japanese Massive Text Embedding Benchmark) ☆77 · Updated 2 weeks ago
- LLM evaluation project for Japanese tasks ☆90 · Updated last month
- Official implementation of "TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models" ☆119 · Updated 2 months ago