speed1313 / jax-llm
JAX implementation of Large Language Models. You can train a GPT-2-like model on 青空文庫 (the aozora-bunko-clean dataset) or any other text dataset.
☆13 · Updated last year
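To illustrate the kind of training loop such a JAX language-model repository builds on, here is a minimal sketch of a next-token-prediction training step. This is not the jax-llm repository's actual code: the model (a single embedding plus output projection, no attention), the hyperparameters, and all names are hypothetical, chosen only to show the `jax.value_and_grad` / `jax.jit` pattern a GPT-2-style trainer uses.

```python
# Minimal, illustrative JAX LM training step (hypothetical; not jax-llm's code).
import jax
import jax.numpy as jnp

VOCAB, DIM, SEQ = 64, 32, 16  # toy sizes for illustration

def init_params(key):
    k1, k2 = jax.random.split(key)
    return {
        "embed": jax.random.normal(k1, (VOCAB, DIM)) * 0.02,
        "out": jax.random.normal(k2, (DIM, VOCAB)) * 0.02,
    }

def forward(params, tokens):
    # Embedding lookup followed by a linear projection back to the vocabulary.
    x = params["embed"][tokens]       # (seq, dim)
    return x @ params["out"]          # (seq, vocab) logits

def loss_fn(params, tokens):
    # Next-token prediction: logits at position t predict the token at t + 1.
    logits = forward(params, tokens[:-1])
    logp = jax.nn.log_softmax(logits, axis=-1)
    return -jnp.mean(jnp.take_along_axis(logp, tokens[1:, None], axis=-1))

@jax.jit
def train_step(params, tokens, lr=0.1):
    # One SGD step: compute loss and gradients, then update every leaf.
    loss, grads = jax.value_and_grad(loss_fn)(params, tokens)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params, loss

key = jax.random.PRNGKey(0)
params = init_params(key)
tokens = jax.random.randint(key, (SEQ,), 0, VOCAB)
params, loss0 = train_step(params, tokens)
params, loss1 = train_step(params, tokens)
print(float(loss0), float(loss1))
```

A real GPT-2-like model replaces `forward` with a stack of causal self-attention blocks, but the loss and update structure stays the same.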
Alternatives and similar repositories for jax-llm
Users interested in jax-llm are comparing it to the repositories listed below.
- ☆27 · Updated last year
- JMultiWOZ: A Large-Scale Japanese Multi-Domain Task-Oriented Dialogue Dataset, LREC-COLING 2024 ☆25 · Updated last year
- Easily turn large English text datasets into Japanese text datasets using open LLMs. ☆25 · Updated 11 months ago
- Evaluation framework for the Swallow project's post-trained large language models ☆24 · Updated 2 months ago
- Preferred Generation Benchmark ☆86 · Updated 2 months ago
- Japanese-BPEEncoder ☆41 · Updated 4 years ago
- ☆33 · Updated last year
- The evaluation scripts of JMTEB (Japanese Massive Text Embedding Benchmark) ☆81 · Updated 2 weeks ago
- ☆34 · Updated 6 years ago
- ☆51 · Updated 2 years ago
- Mixtral-based Ja-En (En-Ja) translation model ☆20 · Updated last year
- ☆88 · Updated 2 years ago
- ☆19 · Updated last year
- Text classification with BERT (2024 edition) ☆30 · Updated last year
- A robust text-processing pipeline framework enabling customizable, efficient, and metric-logged text preprocessing ☆124 · Updated last month
- Evaluation dataset for the Japanese honorific (敬語) conversion task ☆21 · Updated 3 years ago
- ☆25 · Updated 7 months ago
- Evaluation scripts for the Swallow project's large language models ☆23 · Updated 3 months ago
- NLP 100 Exercise 2025 ☆38 · Updated 9 months ago
- ☆49 · Updated last year
- Japanese instruction data ☆24 · Updated 2 years ago
- 🛥 Vaporetto is a fast and lightweight pointwise-prediction-based tokenizer; this is a Python wrapper for Vaporetto. ☆21 · Updated 7 months ago
- Training and evaluation scripts for JGLUE, a Japanese language understanding benchmark ☆18 · Updated this week
- Exploring Japanese SimCSE ☆68 · Updated 2 years ago
- ☆44 · Updated 4 months ago
- ☆55 · Updated last year
- Show notes for https://anchor.fm/yoheikikuta ☆15 · Updated 3 years ago
- Japanese Massive Multitask Language Understanding Benchmark ☆38 · Updated 3 months ago
- DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE. ☆46 · Updated 2 years ago
- ☆50 · Updated last year