DunZhang / KnowledgeDistillationLinks
A general framework for knowledge distillation
☆54Updated 4 years ago
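For reference, a minimal sketch of the standard soft-label distillation objective (soft-target KL divergence at temperature T combined with hard-label cross-entropy, as in Hinton et al.'s formulation) that frameworks like this one typically implement. The function name and the `T`/`alpha` hyper-parameters are illustrative assumptions, not this repository's actual API.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Illustrative KD objective: temperature-scaled KL to the teacher + hard-label CE.

    teacher_logits are assumed to be computed under torch.no_grad().
    """
    # Soft-target term: KL(student || teacher) on temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```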
Alternatives and similar repositories for KnowledgeDistillation
Users interested in KnowledgeDistillation are comparing it to the libraries listed below.
- Adversarial Training for NLP in Keras☆46Updated 5 years ago
- PyTorch implementation of Patient Knowledge Distillation for BERT Model Compression☆203Updated 5 years ago
- Official implementation of AAAI-21 paper "Label Confusion Learning to Enhance Text Classification Models"☆116Updated 2 years ago
- Distilling Task-Specific Knowledge from BERT into Simple Neural Networks.☆13Updated 4 years ago
- A PyTorch implementation of "Improving BERT Fine-Tuning via Self-Ensemble and Self-Distillation"☆56Updated 5 years ago
- Simple experiments with the R-Drop method on Chinese tasks☆91Updated 3 years ago
- Implementation of papers for text classification task on SST-1/SST-2☆67Updated 11 months ago
- Reproduction of the ACL 2020 FastBERT paper; paper: https://arxiv.org/pdf/2004.02178.pdf☆193Updated 3 years ago
- Codes for "Learning Sparse Sharing Architectures for Multiple Tasks"☆95Updated 4 years ago
- a simple PyTorch implementation of Multi-Sample Dropout☆57Updated 5 years ago
- Challenge on the generalization ability of Chinese pre-trained NLP models☆42Updated 4 years ago
- Knowledge Distillation from BERT☆52Updated 6 years ago
- Code for AAAI2021 paper: Few-Shot Learning for Multi-label Intent Detection.☆108Updated 3 years ago
- Learning To Compare For Text: few-shot learning in text classification☆42Updated 5 years ago
- BERT distillation (distillation experiments based on BERT)☆313Updated 4 years ago
- Exploring mixup strategies for text classification☆31Updated 4 years ago
- ☆40Updated 3 years ago
- Knowledge distillation in text classification with PyTorch: Chinese text classification with BERT/XLNet teacher models and a BiLSTM student model.☆225Updated 2 years ago
- Global AI Technology Innovation Competition, Track 3: Xiaobu Assistant short-text semantic matching for dialogue☆37Updated 4 years ago
- PyTorch implementation of the unsupervised SimCSE semantic similarity model☆22Updated 4 years ago
- Classical model implementations of few-shot/one-shot learning, including siamese network, prototypical network, relation network, ind…☆135Updated 5 years ago
- Tianchi AI Innovation Competition 3: write-up shared by Zhou Xingxing of team ch12hu☆26Updated 4 years ago
- UDA (Unsupervised Data Augmentation) implemented in PyTorch☆278Updated 5 years ago
- PyTorch implementations of algorithms for knowledge distillation.☆57Updated 5 years ago
- A clean and easy-to-use TinyBERT: a pre-trained language model obtained by knowledge distillation from BERT☆266Updated 4 years ago
- a beautiful method for clustering or community detection☆50Updated 5 years ago
- A long-text classifier implemented in PyTorch☆45Updated 5 years ago
- ☆19Updated 5 years ago
- BERT annotated from scratch: code comments with the input and output of every step, suitable for beginners☆93Updated 2 years ago
- This repository contains the code for our paper [Enhancing Label Correlation Feedback in Multi-Label Text Classification via Multi-Task L…☆33Updated 3 years ago