guotong1988 / BERT-pre-training

Multi-GPU pre-training for BERT on a single machine, without Horovod (data parallelism)
172 · Updated 2 months ago
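
Below is a minimal sketch of what single-machine multi-GPU data parallelism without Horovod can look like in TensorFlow, using `tf.distribute.MirroredStrategy`. This is an illustration of the general pattern only; the toy model and hyperparameters are placeholders and this is not necessarily the approach the repository itself implements.

```python
import numpy as np
import tensorflow as tf

# MirroredStrategy replicates the model on every visible GPU of one machine
# and averages gradients across replicas (data parallelism, no Horovod needed).
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

VOCAB_SIZE = 30522  # BERT's WordPiece vocabulary size, used here for the toy head
SEQ_LEN = 64        # placeholder sequence length

with strategy.scope():
    # Toy stand-in for a BERT-style model; variables created inside the scope
    # are mirrored across all GPUs.
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(VOCAB_SIZE, 128, input_length=SEQ_LEN),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(VOCAB_SIZE),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-4),
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# Dummy data standing in for tokenized pre-training batches; the global batch
# is split evenly across GPUs by the strategy.
x = np.random.randint(0, VOCAB_SIZE, size=(256, SEQ_LEN))
y = np.random.randint(0, VOCAB_SIZE, size=(256,))
model.fit(x, y, batch_size=64, epochs=1)
```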

Alternatives and similar repositories for BERT-pre-training

Users interested in BERT-pre-training are comparing it to the libraries listed below.
