HoyTta0 / KnowledgeDistillation

Knowledge distillation for text classification with PyTorch. Knowledge distillation for Chinese text classification; teacher models: BERT and XLNet; student model: BiLSTM.
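The repository pairs large teachers (BERT/XLNet) with a compact BiLSTM student. As a rough illustration of how a student is typically trained in this setup, the sketch below combines a temperature-softened KL term against the teacher's logits with ordinary cross-entropy on the labels. The function name and the values of `T` and `alpha` are illustrative assumptions, not taken from the repository's code.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Weighted sum of soft-label (teacher) and hard-label (ground-truth) losses.

    Hypothetical helper for illustration; T and alpha are assumed hyperparameters.
    """
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable (as in Hinton et al.).
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth class labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

In use, the teacher's logits would be computed under `torch.no_grad()` so only the BiLSTM student receives gradient updates.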

Related projects

Alternatives and complementary repositories for KnowledgeDistillation