imirzadeh / Teacher-Assistant-Knowledge-Distillation
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
☆257 · Updated 5 years ago
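The paper's core claim is that when the capacity gap between teacher and student is large, distilling through an intermediate-size "teacher assistant" (teacher → assistant → student) beats direct distillation. As a rough illustration, here is a minimal sketch of the Hinton-style KD loss that such a pipeline applies at each step; this is not the repository's actual code, and all names and hyperparameters below are illustrative.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Weighted sum of soft (distillation) and hard (cross-entropy) losses."""
    # Soft targets: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 so gradient magnitudes stay comparable (Hinton et al.).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def distill(teacher, student, loader, optimizer, device="cpu"):
    """One distillation pass from a frozen teacher into a trainable student."""
    teacher.eval()
    student.train()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        with torch.no_grad():
            t_logits = teacher(x)  # teacher provides soft targets only
        loss = kd_loss(student(x), t_logits, y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

TAKD's contribution is the schedule, not the loss: the same loop runs twice, first with the large teacher distilling into the assistant, then with the trained assistant distilling into the student.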
Alternatives and similar repositories for Teacher-Assistant-Knowledge-Distillation:
Users interested in Teacher-Assistant-Knowledge-Distillation are comparing it to the repositories listed below.
- Knowledge distillation methods implemented with TensorFlow (11 (+1) methods so far, with more to be added)☆264 · Updated 5 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)☆104 · Updated 5 years ago
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision☆236 · Updated 2 years ago
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019)☆395 · Updated 3 years ago
- FitNets: Hints for Thin Deep Nets☆205 · Updated 9 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)☆417 · Updated 4 years ago
- Official implementation of MEAL: Multi-Model Ensemble via Adversarial Learning (AAAI 2019)☆177 · Updated 5 years ago
- MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning (ICCV 2019)☆354 · Updated 4 years ago
- Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 Oral)☆586 · Updated 2 years ago
- A large-scale study of knowledge distillation☆219 · Updated 4 years ago
- PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression☆59 · Updated 7 years ago
- Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks☆379 · Updated 5 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks"☆196 · Updated 5 years ago
- Pruning neural networks with the Taylor criterion in PyTorch☆315 · Updated 5 years ago
- TensorFlow implementation of Deep Mutual Learning☆321 · Updated 6 years ago
- Knowledge distillation using TensorFlow☆142 · Updated 5 years ago
- Implements quantized distillation. Code for our paper "Model compression via distillation and quantization"☆331 · Updated 8 months ago
- [ICLR 2020] "AtomNAS: Fine-Grained End-to-End Neural Architecture Search"☆222 · Updated 4 years ago
- PyTorch implementation of AutoAugment☆159 · Updated 4 years ago
- Learning both Weights and Connections for Efficient Neural Networks (https://arxiv.org/abs/1506.02626; see the pruning sketch after this list)☆177 · Updated 2 years ago
- PyTorch implementation of weight pruning☆185 · Updated 7 years ago
- [ECCV'20 Oral] MutualNet: Adaptive ConvNet via Mutual Learning from Network Width and Resolution☆158 · Updated 2 years ago
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)☆70 · Updated 5 years ago
- PyTorch implementation of Convolutional Networks with Adaptive Inference Graphs☆185 · Updated 6 years ago
- PyTorch implementation of Channel Distillation☆100 · Updated 4 years ago
- BlockDrop: Dynamic Inference Paths in Residual Networks☆142 · Updated 2 years ago
- Code for SkipNet: Learning Dynamic Routing in Convolutional Networks (ECCV 2018)☆239 · Updated 5 years ago
- [AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers"☆73 · Updated last year
- Fast Neural Network Adaptation via Parameter Remapping and Architecture Search (ICLR 2020 & TPAMI)☆166 · Updated 3 years ago
- Papers on deep neural network compression and acceleration☆397 · Updated 3 years ago
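Several of the pruning entries above trace back to the magnitude-pruning recipe of the "Learning both Weights and Connections" entry (Han et al.). Below is a minimal one-shot sketch using PyTorch's built-in `torch.nn.utils.prune` utilities; it is an illustration rather than code from any listed repository (the toy model is hypothetical, and the iterative prune-retrain loop the paper actually uses is omitted).

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small illustrative network (hypothetical, for demonstration only).
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * 32 * 32, 10),
)

# Collect every prunable weight tensor as (module, parameter_name) pairs.
to_prune = [(m, "weight") for m in model.modules()
            if isinstance(m, (nn.Conv2d, nn.Linear))]

# Zero out the 50% of weights with the smallest absolute value, globally.
prune.global_unstructured(to_prune, pruning_method=prune.L1Unstructured, amount=0.5)

# Fold the binary masks back into the weight tensors to make pruning permanent.
for module, name in to_prune:
    prune.remove(module, name)

zeros = sum((m.weight == 0).sum().item() for m, _ in to_prune)
total = sum(m.weight.numel() for m, _ in to_prune)
print(f"global sparsity: {zeros / total:.1%}")  # expect roughly 50%
```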