allegro/HerBERT

HerBERT is a BERT-based language model trained on Polish corpora using only the masked language modeling (MLM) objective, with dynamic masking of whole words.
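The idea behind whole-word masking is that when a word is split into several subword tokens, either all of its subwords are masked or none are; "dynamic" means a fresh mask is sampled every time an example is seen rather than fixed once at preprocessing time. A minimal illustrative sketch (not HerBERT's actual training code; the function and token names are hypothetical):

```python
import random

MASK_TOKEN = "[MASK]"

def whole_word_mask(words, mask_prob=0.15, rng=None):
    """Mask entire words (all their subword tokens) with probability mask_prob.

    `words` is a list of words, each given as a list of subword tokens,
    e.g. [["Her", "##BERT"], ["to"], ["model"]].
    Returns (masked_tokens, labels), where labels hold the original token
    at masked positions and None elsewhere.
    """
    rng = rng or random.Random()
    masked, labels = [], []
    for word in words:
        if rng.random() < mask_prob:
            # Mask every subword of the selected word together.
            masked.extend([MASK_TOKEN] * len(word))
            labels.extend(word)
        else:
            masked.extend(word)
            labels.extend([None] * len(word))
    return masked, labels

# Dynamic masking: calling this again on the same example draws a new mask.
words = [["Her", "##BERT"], ["to"], ["model"], ["język", "##owy"]]
tokens, labels = whole_word_mask(words, mask_prob=0.5, rng=random.Random(0))
```

Pretrained checkpoints such as `allegro/herbert-base-cased` are available through the Hugging Face `transformers` library.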
