allegro / HerBERT
HerBERT is a BERT-based language model trained on Polish corpora using only the MLM objective with dynamic whole-word masking.
☆67 · Updated 3 years ago
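As a quick illustration of the MLM objective the model was trained with, here is a minimal sketch that loads HerBERT through the Hugging Face transformers library and predicts a masked word. The checkpoint name `allegro/herbert-base-cased` is the commonly published Hugging Face ID and is assumed here, not taken from this page.

```python
# Minimal sketch: use HerBERT's masked-language-modeling head to fill in a
# masked word. Assumes the "allegro/herbert-base-cased" checkpoint on the
# Hugging Face Hub (not confirmed by this listing).
from transformers import pipeline

model_id = "allegro/herbert-base-cased"
fill_mask = pipeline("fill-mask", model=model_id)

# Read the mask token from the tokenizer rather than hardcoding it.
# "Stolicą Polski jest <mask>." means "The capital of Poland is <mask>."
masked = f"Stolicą Polski jest {fill_mask.tokenizer.mask_token}."

for pred in fill_mask(masked):
    print(pred["token_str"], round(pred["score"], 3))
```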
Alternatives and similar repositories for HerBERT:
Users interested in HerBERT are comparing it to the libraries listed below.
- RoBERTa models for Polish ☆87 · Updated 3 years ago
- A curated list of resources dedicated to Natural Language Processing (NLP) in Polish. Models, tools, datasets. ☆298 · Updated 3 years ago
- Resources for doing NLP in Polish ☆47 · Updated 5 years ago
- Pre-trained models and language resources for Natural Language Processing in Polish ☆337 · Updated 10 months ago
- Fine-tuning scripts for evaluating transformer-based models on the KLEJ benchmark.