Knowledge distillation in text classification with PyTorch. Knowledge distillation for Chinese text classification, with BERT and XLNet as teacher models and a BiLSTM as the student model.
☆229 · Jul 27, 2022 · Updated 3 years ago
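For context on what these repositories implement: the classic distillation setup trains the small student (here a BiLSTM) to match the temperature-softened output distribution of the large teacher (BERT or XLNet), blended with ordinary cross-entropy on the gold labels. Below is a minimal PyTorch sketch of that loss; the function name and the T/alpha defaults are illustrative assumptions, not values taken from this repo.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Illustrative sketch, not this repo's code; T and alpha are assumed defaults.
    # Soft-target term: KL divergence between temperature-scaled distributions.
    # The T*T factor keeps its gradient magnitude comparable to the CE term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: random tensors stand in for teacher (e.g., BERT) and student
# (e.g., BiLSTM) logits over 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
with torch.no_grad():
    teacher_logits = torch.randn(8, 10)  # teacher stays frozen during distillation
labels = torch.randint(0, 10, (8,))
distillation_loss(student_logits, teacher_logits, labels).backward()
```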
Alternatives and similar repositories for KnowledgeDistillation
Users interested in KnowledgeDistillation are comparing it to the repositories listed below.
- BERT distillation (distillation experiments based on BERT) ☆316 · Jul 30, 2020 · Updated 5 years ago
- BERT distillation in practice, covering distilling BERT into a BiLSTM as well as TinyBERT ☆13 · Apr 23, 2022 · Updated 3 years ago
- A clean, easy-to-use TinyBERT: a pretrained language model obtained by knowledge distillation from BERT ☆272 · Oct 24, 2020 · Updated 5 years ago
- knowledge distillation: train BERT, then use it to guide a TextCNN via knowledge distillation ☆19 · Apr 29, 2021 · Updated 4 years ago
- Knowledge distillation with PyTorch (Chinese text classification) ☆20 · Jan 12, 2023 · Updated 3 years ago
- Reproduction of the paper "Distilling Task-Specific Knowledge from BERT into Simple Neural Networks" ☆16 · Jun 13, 2021 · Updated 4 years ago
- Distilling Task-Specific Knowledge from BERT into Simple Neural Networks.
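The paper behind the last two entries (Tang et al., 2019) distills BERT into a single-layer BiLSTM and, rather than the temperature-softened KL above, uses mean-squared error between raw teacher and student logits as its distillation term. A hedged sketch of that style of objective follows; the function name and the alpha weight are illustrative assumptions, not taken from the paper or these repos.

```python
import torch.nn.functional as F

def logit_mse_distillation_loss(student_logits, teacher_logits, labels, alpha=0.5):
    # Sketch of a Tang-et-al.-style objective: MSE on raw logits plus
    # cross-entropy on gold labels. alpha is an assumed blending weight.
    distill = F.mse_loss(student_logits, teacher_logits)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * hard
```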