HoyTta0 / KnowledgeDistillation
Knowledge distillation in text classification with PyTorch: knowledge distillation for Chinese text classification, with BERT and XLNet as teacher models and a biLSTM as the student model.
☆227 · Updated 3 years ago
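The repository pairs a BERT/XLNet teacher with a biLSTM student. As a rough illustration of how such distillation is commonly wired up (not this repository's exact code), below is a minimal PyTorch sketch of a distillation loss that mixes soft teacher targets with hard labels; the temperature `T` and mixing weight `alpha` are assumed hyperparameters.

```python
# Minimal sketch of a soft-target distillation loss (not the repo's exact code).
# Assumptions: teacher_logits and student_logits are [batch, num_classes];
# T (temperature) and alpha (soft/hard mix) are hypothetical hyperparameters.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.7):
    # Soft targets: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 so gradients keep a comparable magnitude.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

The teacher's logits are typically precomputed (or computed with `torch.no_grad()`) so that only the biLSTM student is updated during training.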
Alternatives and similar repositories for KnowledgeDistillation
Users interested in KnowledgeDistillation are comparing it to the libraries listed below.
- A clean, easy-to-use TinyBERT: a pre-trained language model distilled from BERT ☆266 · Updated 4 years ago
- NEZHA: Neural Contextualized Representation for Chinese Language Understanding ☆262 · Updated 4 years ago
- ☆279 · Updated 3 years ago
- Fine-tuning a BERT PyTorch model for multi-label text classification ☆136 · Updated 5 years ago
- NLP sentence encoding, sentence embeddings, and semantic similarity: BERT_avg, BERT_whitening, SBERT, SimCSE ☆178 · Updated 3 years ago
- PyTorch implementation of unsupervised SimCSE for Chinese ☆135 · Updated 4 years ago
- Text classification baselines: BERT, semi-supervised learning with UDA, adversarial training, and data augmentation ☆104 · Updated 4 years ago
- Chinese language model pre-training in PyTorch ☆389 · Updated 5 years ago
- Fine-tuning pre-trained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆141 · Updated 5 years ago
- Third-place online solution for the Tianchi epidemic text challenge ☆228 · Updated 4 years ago
- Text classification based on Transformers ☆342 · Updated 3 years ago
- PyTorch BERT fine-tuning for Chinese text classification ☆221 · Updated last year
- Champion solution for Track 3 of the Global Artificial Intelligence Technology Innovation Competition ☆239 · Updated 4 years ago
- BERT distillation (distillation experiments based on BERT) ☆314 · Updated 5 years ago
- ☆157 · Updated 4 years ago
- Chinese multi-label classification based on pytorch_bert ☆91 · Updated 3 years ago
- Reproduction of SimCSE for Chinese, supervised + unsupervised ☆278 · Updated 6 months ago
- 7th-place solution for the 2020 CCF Big Data & Computing Intelligence Contest: identifying private information in unstructured business text ☆73 · Updated 4 years ago
- Training MASK BERT from scratch ☆138 · Updated 2 years ago
- CCF 2020 QA match competition, 1st place ☆267 · Updated 4 years ago
- NLP sequence labeling models based on PyTorch + BERT + CRF, currently covering word segmentation, part-of-speech tagging, named entity recognition, etc. ☆62 · Updated 2 years ago
- CoSENT, STS, SentenceBERT ☆170 · Updated 7 months ago
- ☆87 · Updated 3 years ago
- Long-text classification with PyTorch, using FastText, TextCNN, TextRNN, TextRCNN, and Transformer ☆48 · Updated 5 years ago
- Knowledge Graph ☆175 · Updated 3 years ago
- Second-place solution for similar case matching in CAIL 2019 (with dataset and documentation); champion team of the CAIL2020/2021 judicial examination track ☆250 · Updated 4 years ago
- Named entity extraction with iterated dilated convolutions ☆45 · Updated 6 years ago
- GlobalPointer: unified handling of nested and flat NER ☆255 · Updated 4 years ago
- BERT_MRC, a SOTA model for NER tasks ☆61 · Updated last year
- A PyTorch-based toolkit for natural language processing ☆159 · Updated 2 years ago