Law-AI / pretraining-bert

This repository contains the code for pre-training a BERT-base model on a large, unannotated text corpus using dynamic Masked Language Modeling (MLM) and dynamic Next Sentence Prediction (NSP).
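Dynamic MLM resamples the mask pattern on every pass over the data, instead of fixing it once at preprocessing time, so the model sees different corrupted views of the same sentence across epochs. A minimal sketch of such a masking step is below; the token ids (`MASK_ID`, `VOCAB_SIZE`, the special-token set) follow the standard BERT-base vocabulary, and the 80/10/10 replacement split follows the original BERT recipe — the exact implementation in this repository may differ.

```python
import random

MASK_ID = 103             # [MASK] id in the standard BERT vocab (assumed)
VOCAB_SIZE = 30522        # BERT-base vocabulary size (assumed)
SPECIAL_IDS = {101, 102}  # [CLS] and [SEP] are never masked

def dynamic_mlm_mask(input_ids, mlm_prob=0.15, rng=None):
    """Return (masked_ids, labels), resampling the mask on every call.

    labels[i] is the original token where a prediction is required,
    and -100 (the usual "ignore" index for the loss) everywhere else.
    """
    rng = rng or random.Random()
    masked = list(input_ids)
    labels = [-100] * len(input_ids)
    for i, tok in enumerate(input_ids):
        if tok in SPECIAL_IDS or rng.random() >= mlm_prob:
            continue
        labels[i] = tok          # model must recover the original token
        r = rng.random()
        if r < 0.8:              # 80%: replace with [MASK]
            masked[i] = MASK_ID
        elif r < 0.9:            # 10%: replace with a random token
            masked[i] = rng.randrange(VOCAB_SIZE)
        # remaining 10%: keep the original token unchanged
    return masked, labels
```

Because the masking is a cheap per-batch operation, it typically lives in the data collator rather than in the preprocessing pipeline.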
