ShenDezhou / Chinese-PreTrained-BERT

We released BERT-wwm, a Chinese pre-trained model based on Whole Word Masking (WWM), along with models closely related to this technique.
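The released checkpoints follow the standard BERT architecture, so they can be exercised with any BERT-compatible toolkit. Below is a minimal masked-language-model sketch using the Hugging Face transformers library; the checkpoint identifier is a hypothetical placeholder, since this listing does not name the published weights.

```python
# A minimal usage sketch, assuming the released checkpoints use the standard
# BERT format and are loadable via the Hugging Face Hub or a local path.
# The identifier below is a hypothetical placeholder, not a name confirmed
# by this repository.
import torch
from transformers import BertTokenizer, BertForMaskedLM

MODEL_NAME = "path/to/chinese-bert-wwm-checkpoint"  # hypothetical placeholder

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertForMaskedLM.from_pretrained(MODEL_NAME)
model.eval()

# Predict a masked character: "哈尔滨是[MASK]龙江的省会。"
# ("Harbin is the capital of [MASK]longjiang province.")
inputs = tokenizer("哈尔滨是[MASK]龙江的省会。", return_tensors="pt")
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0].item()

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits[0, mask_pos].argmax(-1).item()
print(tokenizer.decode([predicted_id]))  # a well-trained model should print "黑"
```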
