Vincent131499 / Language_Understanding_based_BERT

An implementation of BERT-based pretrained language models, organized in two steps: pretraining and fine-tuning. Currently includes three models — BERT, RoBERTa, and ALBERT — all of which support Whole Word Mask mode.
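The repository's own masking code isn't shown here; below is a minimal Python sketch of the Whole Word Mask idea the description refers to, assuming WordPiece tokenization where continuation pieces carry a `##` prefix. The function name `whole_word_mask` and the 15% masking rate are illustrative choices, not taken from the repository. The point of the technique: a word is either fully masked or left intact, so the model cannot trivially recover a masked piece from the unmasked pieces of the same word.

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]"):
    """Mask entire words at once: WordPiece continuation pieces
    (those starting with '##') are grouped with the piece that
    starts the word, so each word is masked as a unit."""
    # Group token indices into whole-word spans.
    spans = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and spans:
            spans[-1].append(i)   # continuation of the previous word
        else:
            spans.append([i])     # start of a new word

    masked = list(tokens)
    labels = [None] * len(tokens)  # original token where masked, else None
    for span in spans:
        if random.random() < mask_prob:
            for i in span:
                labels[i] = masked[i]
                masked[i] = mask_token
    return masked, labels

tokens = ["play", "##ing", "foot", "##ball", "is", "fun"]
print(whole_word_mask(tokens, mask_prob=0.5))
# e.g. (['[MASK]', '[MASK]', 'foot', '##ball', 'is', 'fun'],
#       ['play', '##ing', None, None, None, None])
```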
