line / LINE-DistilBERT-Japanese
A DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.
46 · Mar 22, 2023 · Updated 2 years ago

