yuanli2333 / Teacher-free-Knowledge-Distillation
Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
☆584 · Updated 2 years ago
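For orientation, the core idea behind the linked paper (teacher-free KD via label smoothing regularization) is to distill the student against a hand-crafted, label-smoothed "virtual teacher" distribution rather than a trained teacher's outputs. The sketch below is a hedged illustration only: the function name `tf_kd_loss` and its parameters (`alpha`, `temperature`, `smoothing`) are hypothetical and not taken from this repository's code.

```python
import torch
import torch.nn.functional as F

def tf_kd_loss(logits, targets, alpha=0.9, temperature=20.0, smoothing=0.99):
    """Hypothetical sketch of a teacher-free KD loss with a virtual teacher.

    The "teacher" is a label-smoothed distribution: `smoothing` probability
    mass on the correct class, the rest spread uniformly over other classes.
    All hyperparameter defaults here are illustrative assumptions.
    """
    num_classes = logits.size(1)
    # Standard cross-entropy on the hard labels.
    ce = F.cross_entropy(logits, targets)
    # Build the virtual-teacher distribution (rows sum to 1).
    teacher = torch.full_like(logits, (1.0 - smoothing) / (num_classes - 1))
    teacher.scatter_(1, targets.unsqueeze(1), smoothing)
    # KL divergence between temperature-softened student predictions
    # and the virtual teacher, scaled by T^2 as in standard KD.
    log_student = F.log_softmax(logits / temperature, dim=1)
    kd = F.kl_div(log_student, teacher, reduction="batchmean") * temperature ** 2
    # Blend the two terms with a hypothetical weighting factor `alpha`.
    return (1.0 - alpha) * ce + alpha * kd
```

In this framing the virtual teacher costs nothing to train, which is what makes the approach "teacher-free"; the KD term acts as a regularizer much like conventional label smoothing.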
Alternatives and similar repositories for Teacher-free-Knowledge-Distillation
Users interested in Teacher-free-Knowledge-Distillation are comparing it to the repositories listed below.
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) · ☆418 · Updated 5 years ago
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019) · ☆402 · Updated 4 years ago
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision · ☆237 · Updated 2 years ago
- Implementation of "Improved Knowledge Distillation via Teacher Assistant": https://arxiv.org/pdf/1902.03393.pdf