ShenDezhou / Chinese-PreTrained-BERT

We released BERT-wwm, a Chinese pre-trained model based on Whole Word Masking (WWM), together with several closely related models.
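A minimal usage sketch for models of this family, assuming a checkpoint published on the Hugging Face Hub; the identifier `hfl/chinese-bert-wwm` is an assumption here, so check the repo's README for the checkpoints it actually provides:

```python
# Sketch: load a Chinese whole-word-masking BERT and fill a [MASK] token.
# "hfl/chinese-bert-wwm" is an assumed hub identifier, not confirmed by this repo.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm")
model = BertForMaskedLM.from_pretrained("hfl/chinese-bert-wwm")
model.eval()

text = "使用语言[MASK]型来预测下一个词的概率。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and decode the highest-scoring token.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```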

Alternatives and similar repositories for Chinese-PreTrained-BERT:

Users interested in Chinese-PreTrained-BERT are comparing it to the libraries listed below.