imirzadeh / Teacher-Assistant-Knowledge-Distillation
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
☆259 · Updated 5 years ago
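For background on the anchor repository: the paper's core idea is to bridge a large teacher and a small student with an intermediate teacher-assistant network, applying the usual softened-logit distillation loss first from teacher to assistant and then from assistant to student. Below is a minimal PyTorch sketch of that loss; the function name and the `T`/`alpha` defaults are illustrative assumptions, not taken from the repository itself.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # In teacher-assistant KD this blended loss is applied twice:
    # once for teacher -> assistant, once for assistant -> student.
    return alpha * soft + (1.0 - alpha) * hard
```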
Alternatives and similar repositories for Teacher-Assistant-Knowledge-Distillation
Users interested in Teacher-Assistant-Knowledge-Distillation are comparing it to the libraries listed below.
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods; more will be added) ☆265 · Updated 5 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆105 · Updated 5 years ago
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019) ☆402 · Updated 4 years ago
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated 2 years ago
- Official implementation of MEAL: Multi-Model Ensemble via Adversarial Learning (AAAI 2019) ☆177 · Updated 5 years ago
- Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 Oral) ☆584 · Updated 2 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆418 · Updated 5 years ago
- FitNets: Hints for Thin Deep Nets ☆208 · Updated 10 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks" ☆196 · Updated 5 years ago
- PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression ☆58 · Updated 7 years ago
- A large-scale study of Knowledge Distillation ☆220 · Updated 5 years ago
- TensorFlow implementation of Deep Mutual Learning ☆323 · Updated 7 years ago
- PyTorch implementation of SNAS ☆75 · Updated 6 years ago
- PyTorch implementation of L-Softmax ☆188 · Updated 6 years ago
- BlockDrop: Dynamic Inference Paths in Residual Networks ☆142 · Updated 2 years ago
- [ICLR 2020] 'AtomNAS: Fine-Grained End-to-End Neural Architecture Search' ☆222 · Updated 5 years ago
- MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning (ICCV 2019) ☆354 · Updated 5 years ago
- PyTorch code for softmax variants: center loss, CosFace loss, large-margin Gaussian mixture, COCOLoss, ring loss ☆255 · Updated 7 years ago
- PyTorch implementation of Channel Distillation ☆102 · Updated 5 years ago
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) ☆71 · Updated 5 years ago
- Code for "Discrimination-aware Channel Pruning for Deep Neural Networks" ☆183 · Updated 4 years ago
- Unofficial PyTorch implementation of Unsupervised Data Augmentation ☆147 · Updated 4 years ago
- [ECCV'20 Oral] MutualNet: Adaptive ConvNet via Mutual Learning from Network Width and Resolution ☆159 · Updated 2 years ago
- [AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers" ☆74 · Updated 2 years ago
- [ICLR 2020] NAS evaluation is frustratingly hard ☆149 · Updated last year
- Trains a student network using knowledge obtained from a larger, pre-trained teacher network ☆158 · Updated 7 years ago
- ☆166 · Updated 2 years ago
- Multi-GPU implementation of DARTS (https://arxiv.org/abs/1806.09055) with PyTorch v1.1 ☆83 · Updated 5 years ago
- Densely Connected Search Space for More Flexible Neural Architecture Search (CVPR 2020) ☆294 · Updated 5 years ago
- MSDNet ☆190 · Updated 3 years ago