imirzadeh / Teacher-Assistant-Knowledge-Distillation
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
☆264 · Oct 3, 2019 · Updated 6 years ago
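The repository implements the teacher assistant idea from the linked paper (arXiv:1902.03393): instead of distilling a large teacher directly into a small student, an intermediate-size "teacher assistant" is distilled from the teacher first, and the student then learns from the assistant. The sketch below is a minimal PyTorch illustration of that two-step chain, assuming a standard Hinton-style distillation loss; model definitions, temperature, and loss weighting are illustrative assumptions, not the repository's exact configuration.

```python
# Minimal sketch of teacher -> teacher assistant -> student distillation.
# Hyperparameters (T, alpha, lr) are illustrative, not the repo's settings.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target KL term plus hard-label cross-entropy (Hinton-style KD)."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def distill(teacher, student, loader, epochs=1, lr=0.1):
    """Train `student` to mimic a frozen `teacher` on `loader`."""
    teacher.eval()
    opt = torch.optim.SGD(student.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for x, y in loader:
            with torch.no_grad():
                t_logits = teacher(x)
            loss = kd_loss(student(x), t_logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student

# TAKD chain: teacher -> teacher assistant, then teacher assistant -> student.
# teacher_assistant = distill(teacher, teacher_assistant, train_loader)
# student = distill(teacher_assistant, student, train_loader)
```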
Alternatives and similar repositories for Teacher-Assistant-Knowledge-Distillation
Users interested in Teacher-Assistant-Knowledge-Distillation are comparing it to the libraries listed below
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆585 · Feb 15, 2023 · Updated 2 years ago
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019