bojone / r-drop
Simple experiments with the R-Drop method on Chinese tasks
☆91 · Updated 3 years ago
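For context, R-Drop feeds each input through the model twice with dropout active and adds a symmetric KL-divergence term between the two output distributions to the usual task loss. Below is a minimal NumPy sketch of that objective; the function names and the `alpha` weight are illustrative choices, not code from this repository.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def kl_div(p, q, axis=-1):
    # KL(p || q), with a small epsilon for numerical stability
    eps = 1e-12
    return (p * (np.log(p + eps) - np.log(q + eps))).sum(axis=axis)

def r_drop_loss(logits1, logits2, labels, alpha=4.0):
    """R-Drop objective: mean cross-entropy of the two dropout passes
    plus an alpha-weighted symmetric KL consistency term."""
    p, q = softmax(logits1), softmax(logits2)
    rows = np.arange(len(labels))
    # Average cross-entropy over the two forward passes
    ce = -0.5 * (np.log(p[rows, labels] + 1e-12)
                 + np.log(q[rows, labels] + 1e-12))
    # Symmetric KL between the two predicted distributions
    sym_kl = 0.5 * (kl_div(p, q) + kl_div(q, p))
    return (ce + alpha * sym_kl).mean()
```

When the two passes produce identical logits, the KL term vanishes and the loss reduces to plain cross-entropy; in training, the two passes differ because dropout samples different masks, and the KL term penalizes that disagreement.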
Alternatives and similar repositories for r-drop
Users interested in r-drop are comparing it to the repositories listed below.
- Simple experiments with Pattern-Exploiting Training on Chinese tasks ☆171 · Updated 4 years ago
- Label Mask for Multi-label Classification ☆56 · Updated 3 years ago
- Official implementation of the AAAI-21 paper "Label Confusion Learning to Enhance Text Classification Models" ☆115 · Updated 2 years ago
- Simple experiments with the P-tuning method on Chinese tasks ☆139 · Updated 4 years ago
- ☆87 · Updated 3 years ago
- WoBERT_pytorch ☆40 · Updated 4 years ago
- Sentence-matching models, including unsupervised SimCSE, ESimCSE, and PromptBERT, and supervised SBERT and CoSENT ☆99 · Updated 2 years ago
- A deep-learning NLP framework built on TensorFlow with a Scikit-Learn-style design. Supports over 40 model classes, covering language models, text classification, NER, MRC, knowledge distillation, and more ☆115 · Updated 2 years ago
- Knowledge-graph question answering based on "Seq2Seq + prefix tree" ☆71 · Updated 3 years ago
- Apply the Circular to the Pretraining Model ☆37 · Updated 3 years ago
- PyTorch implementation of unsupervised SimCSE for Chinese ☆134 · Updated 3 years ago
- A Torch implementation experimenting with Su Jianlin's (苏神) CoSENT ☆32 · Updated 3 years ago
- ☆34 · Updated 4 years ago
- Global AI Technology Innovation Contest, Track 3: Xiaobu Assistant short-text dialogue semantic matching ☆37 · Updated 4 years ago
- Reproduction of the ACL 2020 FastBERT paper (https://arxiv.org/pdf/2004.02178.pdf) ☆194 · Updated 3 years ago
- An upgraded version of RoFormer ☆152 · Updated 2 years ago
- Pretrained-model invocation based on TensorFlow 1.x, with support for single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision; flexible training, validation, and prediction ☆58 · Updated 3 years ago
- TIANCHI Xiaobu Assistant short-text dialogue semantic matching: BERT baseline ☆32 · Updated 4 years ago
- A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large ☆65 · Updated 5 years ago
- ☆91 · Updated 5 years ago
- Negative sampling for solving the unlabeled entity problem in NER. ICLR 2021 paper: "Empirical Analysis of Unlabeled Entity Problem in Nam…" ☆134 · Updated 3 years ago
- Chinese version of the UniLM pretrained model ☆83 · Updated 4 years ago
- ☆127 · Updated 2 years ago
- NLP experiments: new-word mining plus continued pre-training of a pretrained model ☆47 · Updated last year
- Chinese NLP pretrained-model generalization-ability challenge ☆42 · Updated 4 years ago
- Data Augmentation with a Generation Approach for Low-resource Tagging Tasks ☆80 · Updated 4 years ago
- Chinese version of Longformer ☆113 · Updated 4 years ago
- PyTorch version of an unsupervised SimCSE semantic-similarity model ☆22 · Updated 4 years ago
- ☆32 · Updated 3 years ago
- Implementation of SimCSE and ESimCSE on Chinese datasets ☆192 · Updated 3 years ago