imirzadeh / Teacher-Assistant-Knowledge-Distillation
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
☆259 · Updated 5 years ago
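The repository implements Teacher Assistant Knowledge Distillation (TAKD): rather than distilling a large teacher directly into a small student, it trains intermediate-size teacher assistants and distills along the chain teacher → assistant → student. Below is a minimal sketch of the soft-target distillation loss used at each hop, assuming the standard Hinton formulation; the function name, temperature, and weighting defaults are illustrative, not the repository's exact API.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target distillation loss (Hinton et al., 2015).

    TAKD applies this same loss at each hop of the chain
    teacher -> teacher assistant -> student. Names and defaults
    here are illustrative, not the repository's exact API.
    """
    # KL divergence between the softened teacher and student distributions;
    # the T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

Each hop trains the smaller network against the larger one's logits with this loss, and the result then serves as the teacher for the next hop.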
Alternatives and similar repositories for Teacher-Assistant-Knowledge-Distillation
Users interested in Teacher-Assistant-Knowledge-Distillation are comparing it to the repositories listed below.
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆105 · Updated 5 years ago
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added) ☆265 · Updated 5 years ago
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019) ☆403 · Updated 4 years ago
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated 2 years ago
- Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 oral) ☆584 · Updated 2 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks" ☆196 · Updated 5 years ago
- FitNets: Hints for Thin Deep Nets ☆208 · Updated 10 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆418 · Updated 5 years ago
- Official implementation of MEAL: Multi-Model Ensemble via Adversarial Learning (AAAI 2019) ☆177 · Updated 5 years ago
- A large-scale study of Knowledge Distillation ☆220 · Updated 5 years ago
- TensorFlow implementation of Deep Mutual Learning ☆322 · Updated 7 years ago
- PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression
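The last entry above covers the classic single-hop distillation that TAKD extends. For context, here is a hedged sketch of a two-hop TAKD pass composing that idea with the kd_loss function sketched earlier; the toy models, dummy batch, and optimizer settings are assumptions for illustration, not the repository's code.

```python
import torch
import torch.nn as nn

# Toy models of decreasing capacity; the paper's experiments use CNNs on CIFAR.
teacher   = nn.Sequential(nn.Flatten(), nn.Linear(784, 512), nn.ReLU(), nn.Linear(512, 10))
assistant = nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
student   = nn.Sequential(nn.Flatten(), nn.Linear(784, 32),  nn.ReLU(), nn.Linear(32, 10))

# Dummy batch standing in for a real data loader.
images = torch.randn(64, 1, 28, 28)
labels = torch.randint(0, 10, (64,))

# Distill down the chain: teacher -> assistant, then assistant -> student.
# A single optimization step per hop is shown; real training loops over epochs.
for big, small in [(teacher, assistant), (assistant, student)]:
    big.eval()
    opt = torch.optim.SGD(small.parameters(), lr=0.1, momentum=0.9)
    with torch.no_grad():
        t_logits = big(images)          # frozen larger model provides soft targets
    loss = kd_loss(small(images), t_logits, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
```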