speed1313 / jax-llm
JAX implementation of Large Language Models. You can train a GPT-2-like model on 青空文庫 (the aozora bunko-clean dataset) or any other text dataset; a rough sketch of the idea follows below.
☆12 · Updated 7 months ago
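As an illustration of what "a GPT-2-like model in pure JAX" amounts to, here is a minimal, hedged sketch of a single-head, single-block causal language model with a next-token training step. This is not the repository's actual code or API: all names and sizes (`VOCAB`, `DIM`, `init_params`, `train_step`) are hypothetical, and positional embeddings, layer norm, and a proper optimizer are omitted for brevity.

```python
# Minimal pure-JAX sketch of a GPT-style next-token training step.
# Hypothetical toy example; not the jax-llm repository's actual code.
import jax
import jax.numpy as jnp

VOCAB, DIM, SEQ = 256, 64, 32   # hypothetical toy sizes
LR = 1e-3                       # plain-SGD learning rate for the sketch

def init_params(key):
    k1, k2, k3, k4 = jax.random.split(key, 4)
    s = 0.02
    return {
        "embed": jax.random.normal(k1, (VOCAB, DIM)) * s,    # token embeddings
        "wqkv":  jax.random.normal(k2, (DIM, 3 * DIM)) * s,  # fused Q/K/V projection
        "wo":    jax.random.normal(k3, (DIM, DIM)) * s,      # attention output projection
        "head":  jax.random.normal(k4, (DIM, VOCAB)) * s,    # LM head
    }

def forward(params, tokens):
    # tokens: (n,) int32; positional embeddings omitted for brevity
    x = params["embed"][tokens]                          # (n, DIM)
    q, k, v = jnp.split(x @ params["wqkv"], 3, axis=-1)
    scores = (q @ k.T) / jnp.sqrt(DIM)                   # (n, n) attention scores
    n = tokens.shape[0]
    causal = jnp.tril(jnp.ones((n, n), dtype=bool))      # causal mask
    scores = jnp.where(causal, scores, -1e9)
    x = x + jax.nn.softmax(scores, axis=-1) @ v @ params["wo"]  # attention + residual
    return x @ params["head"]                            # (n, VOCAB) logits

def loss_fn(params, tokens):
    # next-token cross-entropy: predict tokens[1:] from tokens[:-1]
    logp = jax.nn.log_softmax(forward(params, tokens[:-1]), axis=-1)
    return -jnp.mean(jnp.take_along_axis(logp, tokens[1:, None], axis=-1))

@jax.jit
def train_step(params, tokens):
    loss, grads = jax.value_and_grad(loss_fn)(params, tokens)
    new_params = jax.tree_util.tree_map(lambda p, g: p - LR * g, params, grads)
    return new_params, loss

key = jax.random.PRNGKey(0)
params = init_params(key)
batch = jax.random.randint(key, (SEQ,), 0, VOCAB)  # stand-in for real token ids
params, loss = train_step(params, batch)
```

A real run would add positional embeddings, layer norm, multiple heads and blocks, and an optimizer such as Adam (e.g. via optax), plus a tokenizer over the chosen text corpus.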
Alternatives and similar repositories for jax-llm:
Users interested in jax-llm are comparing it to the libraries listed below.
- JMultiWOZ: A Large-Scale Japanese Multi-Domain Task-Oriented Dialogue Dataset, LREC-COLING 2024 ☆23 · Updated last year
- ☆26 · Updated 4 months ago
- Easily turn large English text datasets into Japanese text datasets using open LLMs. ☆20 · Updated 2 months ago
- Japanese-BPEEncoder ☆41 · Updated 3 years ago
- Evaluation scripts for the Swallow project's large language models ☆16 · Updated last week
- ☆11 · Updated 6 months ago
- ☆11 · Updated 5 years ago
- Evaluation dataset for the honorific (敬語) conversion task ☆21 · Updated 2 years ago
- A project that rewrites the book 『アルゴリズムとデータ構造』 (Algorithms and Data Structures; by Kensuke Otsuki, supervised by Takuya Akiba; Kodansha) in Python ☆19 · Updated 2 years ago
- ☆20 · Updated 2 months ago
- Training and evaluation scripts for JGLUE, a Japanese language understanding benchmark ☆17 · Updated 3 weeks ago
- ☆22 · Updated last year
- Show notes for https://anchor.fm/yoheikikuta. ☆15 · Updated 2 years ago
- [2024 edition] Text classification with BERT ☆29 · Updated 8 months ago
- ☆50 · Updated 3 years ago
- Preferred Generation Benchmark ☆78 · Updated this week
- ☆83 · Updated last year
- Japanese instruction data (日本語指示データ) ☆22 · Updated last year
- Viterbi-based accelerated tokenizer (Python wrapper) ☆41 · Updated 6 months ago
- ☆50 · Updated last year
- ☆18 · Updated 10 months ago
- DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE. ☆44 · Updated 2 years ago
- Code for a COLING 2020 paper ☆13 · Updated 3 weeks ago
- The evaluation scripts of JMTEB (Japanese Massive Text Embedding Benchmark) ☆52 · Updated 3 weeks ago
- [wip] Lightweight automatic differentiation & deep learning framework implemented in pure Julia. ☆31 · Updated last year
- Implementations of data augmentation methods for Japanese NLP. ☆64 · Updated 2 years ago
- ☆33 · Updated 7 months ago
- ☆34 · Updated 5 years ago
- Support page for the book 『ディープラーニングによる自然言語処理』 (Natural Language Processing with Deep Learning; Kyoritsu Shuppan) ☆10 · Updated last year
- A toy hyperparameter optimization framework intended for understanding Optuna's internal design. ☆83 · Updated 3 years ago