allegro / HerBERT
HerBERT is a BERT-based language model trained on Polish corpora using only the MLM objective with dynamic whole-word masking.
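Whole-word masking means that when any sub-token of a word is selected for masking, every sub-token of that word is masked, so the model must predict the full word from context; "dynamic" means the mask is re-sampled on each pass over the data rather than fixed at preprocessing time. The sketch below is an illustrative, simplified implementation of that idea (not HerBERT's actual training code), assuming WordPiece-style tokens where a `##` prefix marks a continuation sub-token:

```python
import random

def whole_word_mask(subtokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Mask whole words in a list of WordPiece-style sub-tokens.

    Illustrative sketch only: '##'-prefixed tokens are treated as
    continuations of the preceding word. Re-sampling with a different
    seed on each epoch gives the "dynamic" masking behaviour.
    Returns (masked_tokens, labels) where labels holds the original
    token at each masked position and None elsewhere.
    """
    rng = random.Random(seed)
    # Group sub-token indices into whole words.
    words = []
    for i, tok in enumerate(subtokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(subtokens)
    labels = [None] * len(subtokens)
    # Mask each word as a unit: all of its sub-tokens or none of them.
    for word in words:
        if rng.random() < mask_prob:
            for i in word:
                labels[i] = masked[i]
                masked[i] = mask_token
    return masked, labels
```

With a fixed seed the selection is reproducible; in training, the per-epoch re-sampling is what distinguishes this from BERT's original static masking.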
☆67 · Updated 3 years ago
Alternatives and similar repositories for HerBERT
Users interested in HerBERT are comparing it to the libraries listed below.
- RoBERTa models for Polish ☆87 · Updated 3 years ago
- A curated list of resources dedicated to Natural Language Processing (NLP) in Polish: models, tools, and datasets ☆300 · Updated 3 years ago
- Pre-trained models and language resources for Natural Language Processing in Polish