line / LINE-DistilBERT-Japanese
A DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.
☆44 · Updated 2 years ago
Alternatives and similar repositories for LINE-DistilBERT-Japanese:
Users interested in LINE-DistilBERT-Japanese are comparing it to the libraries listed below.
- Mecab + NEologd + Docker + Python3 (☆35, updated 2 years ago)
- AI talk software for Tsukuyomichan (☆40, updated last year)
- A tool to get the katakana reading of an alphabetical string (☆32, updated 3 years ago)
- GPT plays a YouTuber (☆62, updated last year)
- Text classification with BERT, 2024 edition (☆29, updated 9 months ago)
- Explanation video generation tool (☆109, updated 4 months ago)
- Test scripts for LLMs (☆55, updated last year)
- Japanese synonym library