HoyTta0 / KnowledgeDistillation

Knowledge distillation for Chinese text classification with PyTorch, using BERT and XLNet as teacher models and a BiLSTM as the student model.
223 · Updated 2 years ago
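For context, below is a minimal sketch of the soft-target distillation loss commonly used in setups like this one, where a BERT/XLNet teacher supervises a BiLSTM student. This is an illustrative assumption, not the repository's actual code; the function name, `temperature`, and `alpha` weighting are hypothetical choices.

```python
# Minimal sketch (assumed, not taken from the repo) of a standard
# knowledge-distillation loss: KL divergence between temperature-softened
# teacher and student logits, combined with cross-entropy on hard labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft targets: match the teacher's softened class distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: standard cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

if __name__ == "__main__":
    # Toy example: batch of 4 samples, 3 classes, random logits.
    student = torch.randn(4, 3)
    teacher = torch.randn(4, 3)
    labels = torch.tensor([0, 2, 1, 0])
    print(distillation_loss(student, teacher, labels))
```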

Alternatives and similar repositories for KnowledgeDistillation

Users interested in KnowledgeDistillation are comparing it to the libraries listed below.
