line / LINE-DistilBERT-Japanese

DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.
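
Below is a minimal usage sketch for obtaining contextual embeddings from the distilled model with the Hugging Face `transformers` library. The model id `line-corporation/line-distilbert-base-japanese` and the need for `trust_remote_code=True` (the custom Japanese tokenizer) are assumptions based on LINE's Hub release; check the repository for the exact id and tokenizer dependencies (e.g. `fugashi`, `unidic-lite`, `sentencepiece`).

```python
# Minimal sketch: encode a Japanese sentence with LINE-DistilBERT-Japanese.
# MODEL_ID is an assumed Hugging Face Hub id; verify against the repository.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "line-corporation/line-distilbert-base-japanese"  # assumed id

# trust_remote_code=True is assumed to be required for the custom tokenizer.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_ID)

text = "LINE株式会社で自然言語処理の研究・開発をしている。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, seq_len, hidden_size):
# one contextual embedding per input token.
print(outputs.last_hidden_state.shape)
```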
