line / LINE-DistilBERT-Japanese
DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.
☆44 · Updated 2 years ago
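For reference, a minimal sketch of loading the model with Hugging Face `transformers`. The Hub checkpoint ID `line-corporation/line-distilbert-base-japanese` and the need for `trust_remote_code=True` (the Japanese tokenizer ships as custom code) are assumptions to verify against the model card:

```python
# Minimal sketch: extract sentence embeddings with LINE-DistilBERT-Japanese.
# Assumes the Hub ID below and that the custom tokenizer needs
# trust_remote_code=True; check the model card before relying on either.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "line-corporation/line-distilbert-base-japanese"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_ID)

text = "自然言語処理の研究をしています。"  # "I am doing NLP research."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Token-level hidden states: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```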
Alternatives and similar repositories for LINE-DistilBERT-Japanese
Users interested in LINE-DistilBERT-Japanese are comparing it to the libraries listed below.
- Mecab + NEologd + Docker + Python3 · ☆35 · Updated 3 years ago
- JMultiWOZ: A Large-Scale Japanese Multi-Domain Task-Oriented Dialogue Dataset, LREC-COLING 2024 · ☆24 · Updated last year
- GPT acts as a YouTuber · ☆62 · Updated last year
- ☆31 · Updated 8 months ago