HoyTta0 / KnowledgeDistillation
Knowledge distillation for text classification in PyTorch: Chinese text classification with BERT and XLNet as teacher models and a BiLSTM as the student model.
☆230 · Updated 3 years ago
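The repository above distills a BERT/XLNet teacher into a BiLSTM student. A minimal pure-Python sketch of the standard distillation objective — temperature-softened KL term blended with hard-label cross-entropy — assuming nothing about the repository's actual code (all names and hyperparameters here are illustrative):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T flattens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label, T=2.0, alpha=0.5):
    # alpha weighs the soft (distillation) term against the hard-label term.
    # The KL term is scaled by T^2 to keep gradient magnitudes comparable
    # across temperatures (as in Hinton et al.'s formulation).
    p_t = softmax(teacher_logits, T)   # teacher's softened distribution
    p_s = softmax(student_logits, T)   # student's softened distribution
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s))
    ce = -math.log(softmax(student_logits)[hard_label])  # hard-label cross-entropy
    return alpha * (T ** 2) * kl + (1 - alpha) * ce

# Toy logits for one example with 3 classes, true class 0.
loss = distillation_loss([2.0, 0.5, -1.0], [1.5, 1.0, -0.5], hard_label=0)
```

In a real training loop the same computation would run batched on tensors (e.g. PyTorch's `KLDivLoss` plus `CrossEntropyLoss`); the scalar version only shows the shape of the objective.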
Alternatives and similar repositories for KnowledgeDistillation
Users interested in KnowledgeDistillation are comparing it to the repositories listed below.
- A simple, easy-to-use TinyBERT: a pretrained language model distilled from BERT ☆268 · Updated 5 years ago
- ☆279 · Updated 3 years ago
- NEZHA: Neural Contextualized Representation for Chinese Language Understanding ☆261 · Updated 4 years ago
- Fine-tuning a PyTorch BERT model for multi-label text classification ☆137 · Updated 6 years ago
- Chinese language model pretraining in PyTorch ☆387 · Updated 5 years ago
- PyTorch implementation of unsupervised SimCSE for Chinese ☆135 · Updated 4 years ago
- Third-place online solution to the Tianchi epidemic text challenge ☆228 · Updated 4 years ago
- NLP sentence encoding, sentence embeddings, and semantic similarity: BERT_avg, BERT_whitening, SBERT, SimCSE ☆179 · Updated 3 years ago
- Knowledge Graph ☆176 · Updated 3 years ago
- Training a masked-language-model BERT from scratch ☆139 · Updated 2 years ago
- Text classification based on Transformers ☆341 · Updated 4 years ago
- 7th-place solution to the 2020 CCF Big Data & Computational Intelligence Contest: identifying private information in unstructured business text ☆73 · Updated 4 years ago
- Chinese multi-label classification based on pytorch_bert ☆92 · Updated 4 years ago
- CoSENT, STS, SentenceBERT ☆171 · Updated 9 months ago
- PyTorch BERT fine-tuning for Chinese text classification ☆221 · Updated last year
- Text classification baselines: BERT, semi-supervised learning (UDA), adversarial training, data augmentation ☆104 · Updated 4 years ago
- GlobalPointer: unified handling of nested and flat NER ☆257 · Updated 4 years ago
- BERT distillation (distillation experiments based on BERT) ☆315 · Updated 5 years ago
- ☆88 · Updated 4 years ago
- Reproduction of SimCSE on Chinese, supervised + unsupervised ☆281 · Updated 9 months ago
- Fine-tuning pretrained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆141 · Updated 5 years ago
- Experiments with several semantic matching models and a comparison of their results ☆163 · Updated last month
- Implementation of SimCSE and ESimCSE on Chinese datasets ☆193 · Updated 3 years ago
- Reproduction of the paper "Named Entity Recognition as Dependency Parsing" ☆131 · Updated 4 years ago
- Tianchi Xiaobu Assistant short-text dialogue semantic matching: BERT baseline ☆32 · Updated 4 years ago
- Chinese NER model based on lexical information fusion ☆170 · Updated 3 years ago
- CCF 2020 QA matching competition, 1st place ☆268 · Updated 4 years ago
- Hugging BERT together. Misc scripts for Huggingface transformers. ☆73 · Updated 2 years ago
- ☆136 · Updated 4 years ago
- Top-6 solution to the Tianchi COVID-19 similar sentence-pair matching competition ☆77 · Updated 3 years ago
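Several of the listed repositories implement SimCSE-style contrastive sentence embedding. A minimal pure-Python sketch of the unsupervised SimCSE objective (in-batch InfoNCE over cosine similarities); the function names and toy vectors are illustrative, not taken from any listed repository:

```python
import math

def cos(u, v):
    # Cosine similarity between two dense vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def simcse_loss(z1, z2, tau=0.05):
    # z1[i] and z2[i] are two dropout-perturbed encodings of the SAME
    # sentence (the positive pair); every z2[j] with j != i serves as an
    # in-batch negative. tau is the InfoNCE temperature.
    n = len(z1)
    total = 0.0
    for i in range(n):
        sims = [cos(z1[i], z2[j]) / tau for j in range(n)]
        log_denom = math.log(sum(math.exp(s) for s in sims))
        total += -(sims[i] - log_denom)   # -log softmax over the batch
    return total / n
```

When each row's positive is its own best match the loss approaches zero; mismatched pairs drive it up, which is exactly the signal that pulls the two dropout views of a sentence together in embedding space.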