PERT: Pre-training BERT with Permuted Language Model
☆367 · Updated Jul 15, 2025 (8 months ago)
Alternatives and similar repositories for PERT
Users interested in PERT are comparing it to the libraries listed below.
- ExpMRC: Explainability Evaluation for Machine Reading Comprehension ☆62 · Updated Aug 30, 2023 (2 years ago)
- LERT: A Linguistically-motivated Pre-trained Language Model ☆223 · Updated Jul 15, 2025 (8 months ago)
- Revisiting Pre-trained Models for Chinese Natural Language Processing (MacBERT) ☆705 · Updated Jul 15, 2025 (8 months ago)
- Pre-trained Chinese ELECTRA ☆1,439 · Updated Jul 15, 2025 (8 months ago)
- Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm model series) ☆10,183 · Updated Jul 15, 2025 (8 months ago)
- Cross-Lingual Machine Reading Comprehension (EMNLP 2019) ☆67 · Updated Nov 6, 2019 (6 years ago)
- A PyTorch-based model pruning toolkit for pre-trained language models ☆389 · Updated Aug 31, 2023 (2 years ago)
- A Sentence Cloze Dataset for Chinese Machine Reading Comprehension (CMRC 2019) ☆126 · Updated Oct 24, 2022 (3 years ago)
- The First Evaluation Workshop on Chinese Machine Reading Comprehension (CMRC 2017) ☆92 · Updated May 28, 2019 (6 years ago)
- A Span-Extraction Dataset for Chinese Machine Reading Comprehension (CMRC 2018) ☆452 · Updated Jun 15, 2022 (3 years ago)
- VLE: Vision-Language Encoder (a vision-language multimodal pre-trained model) ☆194 · Updated Mar 13, 2023 (3 years ago)
- Machine reading comprehension: champion/runner-up competition code and Chinese pre-trained MRC models ☆743 · Updated Nov 19, 2022 (3 years ago)
- Pre-Trained Chinese XLNet ☆1,648 · Updated Jul 15, 2025 (8 months ago)
- Collections of resources from Joint Laboratory of HIT and iFLYTEK Research (HFL) ☆376 · Updated Mar 9, 2023 (3 years ago)
- MiniRBT (a series of small Chinese pre-trained models) ☆301 · Updated Jul 15, 2025 (8 months ago)
- Conversational Word Embedding for Retrieval-based Dialog System (ACL 2020) ☆30 · Updated Sep 2, 2020 (5 years ago)
- ☆10 · Updated Sep 27, 2021 (4 years ago)
- CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation ☆496 · Updated Dec 30, 2022 (3 years ago)
- Simple experiments with Pattern-Exploiting Training on Chinese ☆173 · Updated Oct 10, 2020 (5 years ago)
- A PyTorch-based knowledge distillation toolkit for natural language processing ☆1,697 · Updated May 8, 2023 (2 years ago)
- DataCLUE: a data-centric NLP benchmark and toolkit ☆144 · Updated May 11, 2022 (3 years ago)
- CINO: Pre-trained Language Models for Chinese Minority Languages ☆262 · Updated Jul 15, 2025 (8 months ago)
- Code for ACL 2021 paper "ChineseBERT: Chinese Pretraining Enhanced by Glyph and Pinyin Information" ☆566 · Updated Jul 26, 2023 (2 years ago)
- A 30,000+ Chinese MRC dataset: Delta Reading Comprehension Dataset ☆313 · Updated Apr 21, 2020 (5 years ago)
- Code for ACL 2020 paper "FLAT: Chinese NER Using Flat-Lattice Transformer" ☆1,004 · Updated May 10, 2022 (3 years ago)
- Chinese pre-trained RoBERTa models (RoBERTa for Chinese) ☆2,773 · Updated Jul 22, 2024 (last year)
- Code used in our IJCAI 2019 paper "Story Ending Prediction by Transferable BERT" ☆24 · Updated Nov 21, 2022 (3 years ago)
- Simple experiments with SimCSE on Chinese tasks ☆606 · Updated Aug 7, 2023 (2 years ago)
- CharBERT: Character-aware Pre-trained Language Model (COLING 2020) ☆121 · Updated Jan 28, 2021 (5 years ago)
- Pre-trained language models and related optimization techniques developed by Huawei Noah's Ark Lab ☆3,155 · Updated Jan 22, 2024 (2 years ago)
- Awesome Pretrained Chinese NLP Models: a collection of high-quality Chinese pre-trained models, large models, multimodal models, and large language models ☆5,535 · Updated Feb 16, 2026 (last month)
- Chinese generative pre-trained models ☆99 · Updated Aug 28, 2020 (5 years ago)
- TripleNet: Triple Attention Network for Multi-Turn Response Selection in Retrieval-based Chatbots (CoNLL 2019) ☆26 · Updated Nov 18, 2019 (6 years ago)
- PromptCLUE: a zero-shot learning model supporting all Chinese tasks ☆665 · Updated Jun 16, 2023 (2 years ago)
- Code for our SIGIR 2022 accepted paper "P3 Ranker: Mitigating the Gaps between Pre-training and Ranking Fine-tuning with Prompt-based L…" ☆18 · Updated Sep 24, 2023 (2 years ago)
- Chinese MobileBERT ☆97 · Updated Mar 2, 2022 (4 years ago)
- ☆272 · Updated Jul 26, 2024 (last year)
- [EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings (https://arxiv.org/abs/2104.08821) ☆3,643 · Updated Oct 16, 2024 (last year)
- NEZHA: Neural Contextualized Representation for Chinese Language Understanding ☆259 · Updated Aug 13, 2021 (4 years ago)