ymcui / PERT
PERT: Pre-training BERT with Permuted Language Model
☆361 · Updated 2 years ago
Alternatives and similar repositories for PERT
Users that are interested in PERT are comparing it to the libraries listed below
- LERT: A Linguistically-motivated Pre-trained Language Model (a pre-trained model enhanced with linguistic information) ☆214 · Updated 2 years ago
- Revisiting Pre-trained Models for Chinese Natural Language Processing (MacBERT) ☆672 · Updated 2 years ago
- An upgraded version of SimBERT (SimBERTv2)! ☆444 · Updated 3 years ago
- Code for ACL 2021 paper "ChineseBERT: Chinese Pretraining Enhanced by Glyph and Pinyin Information" ☆559 · Updated last year
- ☆417 · Updated last year
- A Chinese generative pre-trained language model ☆567 · Updated 3 years ago
- Collections of resources from Joint Laboratory of HIT and iFLYTEK Research (HFL) ☆370 · Updated 2 years ago
- A large-scale Chinese natural language inference and semantic similarity calculation dataset ☆430 · Updated 5 years ago
- CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation ☆489 · Updated 2 years ago
- An end-to-end long-text summarization model (CAIL 2020 judicial summarization track) ☆398 · Updated last year
- Large-scale Pre-training Corpus for Chinese (100 GB of Chinese pre-training corpora) ☆968 · Updated 2 years ago
- Chinese text paraphrasing and data augmentation based on the LaserTagger model.