line / LINE-DistilBERT-Japanese
A DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.
☆45 · Updated 2 years ago
Alternatives and similar repositories for LINE-DistilBERT-Japanese
Users interested in LINE-DistilBERT-Japanese are comparing it to the libraries listed below.
- A collection of inappropriate expressions in Japanese. Useful for data cleaning in natural language processing and similar tasks.