xv44586 / Knowledge-Distillation-NLP
Some demos of Knowledge Distillation in NLP
☆23 · Updated 4 years ago
Alternatives and similar repositories for Knowledge-Distillation-NLP
Users interested in Knowledge-Distillation-NLP are comparing it to the libraries listed below.
- Baseline model implementations for several NLP tasks, including text classification, named entity recognition, entity relation extraction, NL2SQL, CKBQA, and various BERT downstream-task applications. ☆48 · Updated 4 years ago
- Detailed annotations of the implementation principles and matrix operations in Su Jianlin's bert4keras, for easier study; bert4keras: https://github.com/bojone/bert4keras ☆42 · Updated 4 years ago
- Pretrained-model loading based on TensorFlow 1.x, with support for single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision; flexible training, validation, and prediction. ☆59 · Updated 4 years ago
- ☆92 · Updated 5 years ago
- ☆45 · Updated 4 years ago
- The 2019 Chinese human-machine dialogue natural language understanding competition, hosted by the Social Media Processing Technical Committee of the Chinese Information Processing Society of China ☆76 · Updated 5 years ago
- PyTorch implementation of unsupervised SimCSE for Chinese ☆135 · Updated 4 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆174 · Updated 5 years ago
- A baseline for the Xiaobu Assistant dialogue short-text semantic matching competition ☆139 · Updated 4 years ago
- ☆90 · Updated 5 years ago
- Transformers implementation (architecture, task examples, serving, and more) ☆96 · Updated 3 years ago
- Joint learning of machine retrieval and reading comprehension; rank-6 solution for the 2nd national "Military Intelligence Machine Reading" challenge (莱斯杯) ☆128 · Updated 5 years ago
- A reproduction of the ACL 2020 FastBERT paper; paper: https://arxiv.org/pdf/2004.02178.pdf ☆194 · Updated 3 years ago
- CCF 2020 QA match competition top-1 solution ☆268 · Updated 4 years ago
- ☆280 · Updated 4 years ago
- Chinese NLP datasets ☆158 · Updated 6 years ago
- Label Mask for Multi-label Classification ☆58 · Updated 4 years ago
- DIAC2019 question-equivalence judgment competition based on adversarial attack ☆82 · Updated 5 years ago
- ☆34 · Updated 4 years ago
- A demo of data augmentation for NLP ☆48 · Updated 5 years ago
- Prize-winning solution for the 2019 DataGrand Cup (达观杯) intelligent information extraction challenge ☆17 · Updated 5 years ago
- Fine-tuning pretrained language models for multi-label classification (supports loading well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆141 · Updated 5 years ago
- Using BERT for the LIC2019 machine reading comprehension task ☆90 · Updated 6 years ago
- Introductory articles on dialogue rewriting ☆98 · Updated 2 years ago
- NLP sentence encoding, sentence embeddings, and semantic similarity: BERT_avg, BERT_whitening, SBERT, SimCSE ☆179 · Updated 3 years ago
- Top-3 solution for the reading comprehension track of the 2019 CAIL (法研杯) competition ☆151 · Updated 2 years ago
- Convert https://github.com/brightmart/albert_zh to the Google format ☆62 · Updated 5 years ago
- Adversarial-attack text matching competition ☆42 · Updated 6 years ago
- Sentence matching models, including unsupervised SimCSE, ESimCSE, and PromptBERT, and supervised SBERT and CoSENT. ☆100 · Updated 3 years ago
- Tianchi epidemic text challenge competition ☆51 · Updated 5 years ago