wietsedv / bertje

BERTje is a Dutch pre-trained BERT model developed at the University of Groningen. Associated paper (EMNLP Findings 2020): "What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models".
138 · Updated 2 years ago
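
A minimal sketch of how BERTje is typically used: it is published on the Hugging Face Hub and can be loaded with the Transformers library. The model ID "GroNLP/bert-base-dutch-cased" is assumed here as BERTje's published checkpoint name; check the repository's README for the authoritative identifier.

```python
# Sketch: loading BERTje via Hugging Face Transformers (model ID assumed,
# not taken from this listing page).
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("GroNLP/bert-base-dutch-cased")
model = AutoModel.from_pretrained("GroNLP/bert-base-dutch-cased")

# Encode a Dutch sentence and obtain contextual token embeddings.
inputs = tokenizer("Dit is een voorbeeldzin.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```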

Alternatives and similar repositories for bertje

Users who are interested in bertje are comparing it to the libraries listed below.
