allegro / HerBERT
HerBERT is a BERT-based language model trained on Polish corpora using only the MLM objective with dynamic whole-word masking.
☆70Updated 3 years ago
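The dynamic whole-word masking mentioned in the description can be sketched in plain Python. This is an illustrative sketch, not HerBERT's actual training code: it assumes BERT-style WordPiece tokens where a `##` prefix marks a continuation subword, and masks all subwords of a sampled word together, re-sampling on every call ("dynamic").

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]", seed=None):
    """Mask whole words: '##'-prefixed subwords are masked together
    with their head token. Returns (masked tokens, MLM labels)."""
    rng = random.Random(seed)
    # Group token indices into words: a '##' token continues the previous word.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(tokens)
    labels = [None] * len(tokens)  # None = not masked, else the original token
    for word in words:
        if rng.random() < mask_prob:
            for i in word:
                labels[i] = tokens[i]
                masked[i] = mask_token
    return masked, labels

tokens = ["Her", "##BERT", "is", "a", "model"]
masked, labels = whole_word_mask(tokens, mask_prob=0.5, seed=0)
```

Because masking is re-drawn per call rather than fixed at preprocessing time, the model sees different masked positions for the same sentence across epochs.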
Alternatives and similar repositories for HerBERT
Users interested in HerBERT are comparing it to the libraries listed below.
- A curated list of resources dedicated to Natural Language Processing (NLP) in Polish. Models, tools, datasets.☆306Updated 4 years ago
- Pre-trained models and language resources for Natural Language Processing in Polish☆364Updated last year
- RoBERTa models for Polish☆89Updated 3 years ago
- ☆50Updated 3 years ago
- Resources for doing NLP in Polish☆48Updated 6 years ago
- Polish BERT☆72Updated 5 years ago
- Polish Dataset of Banned Harmful and Offensive Content from the Wykop.pl web service☆58Updated 11 months ago
- Polish RoBERTA model trained on Polish literature, Wikipedia, and Oscar. The major assumption is that quality text will give a good mode…☆35Updated 4 years ago
- Python port of Stempel, an algorithmic stemmer for the Polish language.☆39Updated last year
- Polish morphological tagger.☆43Updated 2 years ago
- Fine-tuning scripts for evaluating transformer-based models on KLEJ benchmark.☆26Updated 2 years ago
- Toolkit to help understand "what lies" in word embeddings. Also benchmarking!☆477Updated 2 years ago
- Fixes contractions such as `you're` to `you are`☆320Updated 3 years ago
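The contraction fixer above is essentially a lookup-and-replace over a large table of known contractions. A minimal stdlib sketch of the idea, with a tiny hand-picked table rather than the library's full dictionary:

```python
import re

# A few sample mappings; a real fixer ships a much larger table.
CONTRACTIONS = {
    "you're": "you are",
    "don't": "do not",
    "it's": "it is",
    "can't": "cannot",
}

def fix(text):
    """Replace known contractions case-insensitively, word by word.
    Note: this simple version lower-cases the replacement."""
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(c) for c in CONTRACTIONS) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub(lambda m: CONTRACTIONS[m.group(0).lower()], text)

fix("I don't know")  # -> "I do not know"
```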
- Evaluation of Sentence Representations in Polish☆23Updated 3 years ago
- Natural language processing course taught at AGH University of Science and Technology☆69Updated last month
- Ten Thousand German News Articles Dataset for Topic Classification☆86Updated 3 years ago
- ☆30Updated 3 years ago
- GilBERTo: a pretrained language model based on RoBERTa for Italian☆73Updated 6 years ago
- A Greek edition of BERT pre-trained language model☆149Updated last year
- Pre-trained Nordic models for BERT☆175Updated 4 years ago
- BERTje is a Dutch pre-trained BERT model developed at the University of Groningen. (EMNLP Findings 2020) "What’s so special about BERT’s …☆141Updated 2 years ago
- 🌸 fastText + Bloom embeddings for compact, full-coverage vectors with spaCy☆329Updated 8 months ago
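The "Bloom embeddings" behind the compact, full-coverage vectors above can be sketched in a few lines: each word is hashed several times into one small table and the selected rows are summed, so every string, including out-of-vocabulary words, gets a vector without a fixed vocabulary. This is an illustrative sketch with a toy deterministic table, not floret's implementation (where the rows are learned parameters).

```python
import zlib

TABLE_ROWS, DIM, NUM_HASHES = 1000, 4, 3

# Toy deterministic "embedding table" for the demo; in a trained model
# these rows would be learned parameters.
table = [[((r * 31 + d) % 7) / 7.0 for d in range(DIM)] for r in range(TABLE_ROWS)]

def bloom_vector(word):
    """Hash `word` with several seeds into the table and sum the rows."""
    rows = [zlib.crc32(f"{seed}:{word}".encode()) % TABLE_ROWS
            for seed in range(NUM_HASHES)]
    return [sum(table[r][d] for r in rows) for d in range(DIM)]

v = bloom_vector("HerBERT")  # any string gets a DIM-length vector
```

Collisions are tolerated by design: because each word combines several rows, two words rarely share all of their rows, which is what keeps the table small while still covering every input.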
- ☆76Updated 2 years ago
- Natural Intelligence is still a pretty good idea.☆823Updated last year
- NeuSpell: A Neural Spelling Correction Toolkit☆702Updated 2 years ago
- skweak: A software toolkit for weak supervision applied to NLP tasks☆926Updated last year
- DaNLP is a repository for Natural Language Processing resources for the Danish Language.☆207Updated 10 months ago
- 🐦 Quickly annotate data from the comfort of your Jupyter notebook☆788Updated last year
- Doubt your data, find bad labels.☆517Updated last year
- AlBERTo, the first Italian BERT model for Twitter language understanding☆72Updated 5 years ago