allegro / HerBERT

HerBERT is a BERT-based language model trained on Polish corpora using only the MLM objective with dynamic whole-word masking.
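
HerBERT checkpoints are published on the Hugging Face Hub, so the model can be used directly with the `transformers` library. A minimal sketch of filling a masked token, assuming the `allegro/herbert-base-cased` checkpoint name:

```python
# Minimal sketch: masked-token prediction with HerBERT via Hugging Face
# transformers. The checkpoint name "allegro/herbert-base-cased" is assumed here.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="allegro/herbert-base-cased")

# Use the tokenizer's own mask token instead of hard-coding it.
mask = fill_mask.tokenizer.mask_token

# "Warszawa to stolica <mask>." -> "Warsaw is the capital of <mask>."
for prediction in fill_mask(f"Warszawa to stolica {mask}."):
    print(prediction["token_str"], round(prediction["score"], 3))
```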

Alternatives and similar repositories for HerBERT:

Users interested in HerBERT are comparing it to the libraries listed below.