xueyouluo / S2S-in-Production
Sharing some problems encountered when running S2S (seq2seq) models in production, along with their solutions.
☆27 · Updated 5 years ago
Alternatives and similar repositories for S2S-in-Production
Users that are interested in S2S-in-Production are comparing it to the libraries listed below
- A RoBERTa-wwm base model distilled from RoBERTa-wwm-large ☆65 · Updated 5 years ago
- A PyTorch implementation of "Improving BERT Fine-Tuning via Self-Ensemble and Self-Distillation" ☆56 · Updated 5 years ago
- A reproduction of the ACL 2020 FastBERT paper (//arxiv.org/pdf/2004.02178.pdf) ☆193 · Updated 3 years ago
- Some methods for unsupervised text generation ☆48 · Updated 4 years ago
- The SOTA solution and an online demo for the CTC2021 Chinese text correction competition ☆72 · Updated 2 years ago
- 2020 Language and Intelligence Challenge: recommendation-oriented dialogue task ☆52 · Updated 4 years ago
- Negative sampling for solving the unlabeled entity problem in NER. ICLR-2021 paper: Empirical Analysis of Unlabeled Entity Problem in Nam… ☆134 · Updated 3 years ago
- Simple experiments with the R-Drop method on Chinese tasks ☆91 · Updated 3 years ago
- The code for "A Unified MRC Framework for Named Entity Recognition" ☆33 · Updated 5 years ago
- [ACL 2020] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering ☆120 · Updated 2 years ago
- ☆59 · Updated 5 years ago
- Knowledge graph question answering based on "Seq2Seq + prefix trie" ☆71 · Updated 3 years ago
- ☆51 · Updated 3 years ago
- Dataset and baseline for the CCL2019 "小牛杯" Chinese humor computation task ☆23 · Updated 11 months ago
- PyTorch baselines for the CLUE benchmark ☆75 · Updated 5 years ago
- The repository for the NLPCC2020 AutoIE task ☆51 · Updated 5 years ago
- Global AI Technology Innovation Competition, Track 3: 小布助手 (Xiaobu assistant) dialogue short-text semantic matching ☆37 · Updated 4 years ago
- Unsupervised Chinese word segmentation and syntactic parsing based on BERT ☆110 · Updated 5 years ago
- The dataset and the evaluation tool for NLPCC2018 Shared Task 2: Grammatical Error Correction (GEC) ☆55 · Updated 3 years ago
- 24*2 pre-trained small BERT models, a handy toolkit for NLP practitioners ☆50 · Updated 5 years ago
- UNF (Universal NLP Framework) ☆71 · Updated 5 years ago
- A TensorFlow version of BERT-of-Theseus ☆62 · Updated 4 years ago
- Knowledge Distillation from BERT ☆53 · Updated 6 years ago
- Data release for the CCL2020 "小牛杯" humor computation task ☆22 · Updated 11 months ago
- ☆89 · Updated 5 years ago
- A PyTorch implementation of unsupervised SimCSE for semantic similarity ☆22 · Updated 4 years ago
- A repo for converting OntoNotes-5.0 to CoNLL format ☆132 · Updated 2 years ago
- An easy BERT pretraining workflow using the tokenizers and transformers repos ☆31 · Updated 5 years ago
- ☆47 · Updated 4 years ago
- Chinese UniLM pre-trained model ☆82 · Updated 4 years ago