imirzadeh / Teacher-Assistant-Knowledge-Distillation
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
☆258 · Updated 5 years ago
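The paper's core idea is to bridge the capacity gap between a large teacher and a small student by distilling through an intermediate-size teacher assistant (teacher → assistant → student). The sketch below is a minimal, illustrative PyTorch version of that two-stage setup using the standard Hinton soft-target loss; the function names (`kd_loss`, `distill`), hyperparameters (`T`, `alpha`, `lr`), and the placeholder models/loader in the comments are assumptions for illustration, not the repository's actual code.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Standard soft-target distillation loss (Hinton et al., 2015)."""
    # KL divergence between temperature-softened teacher and student
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def distill(teacher, student, loader, epochs=1, lr=0.1, device="cpu"):
    """Train `student` to mimic a frozen `teacher` on `loader`."""
    teacher.eval()
    student.train()
    opt = torch.optim.SGD(student.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            with torch.no_grad():
                t_logits = teacher(x)  # teacher is not updated
            loss = kd_loss(student(x), t_logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student

# TAKD in two stages: distill the large teacher into a mid-sized assistant
# first, then distill the assistant into the small student (names hypothetical):
# assistant = distill(big_teacher, assistant, train_loader)
# student   = distill(assistant, small_student, train_loader)
```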
Related projects
Alternatives and complementary repositories for Teacher-Assistant-Knowledge-Distillation
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision ☆234 · Updated last year
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019) ☆395 · Updated 3 years ago
- Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 oral) ☆580 · Updated last year
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added) ☆266 · Updated 5 years ago
- Official implementation of MEAL: Multi-Model Ensemble via Adversarial Learning (AAAI 2019) ☆177 · Updated 4 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆414 · Updated 4 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆104 · Updated 5 years ago
- FitNets: Hints for Thin Deep Nets ☆204 · Updated 9 years ago
- Unofficial PyTorch implementation of Unsupervised Data Augmentation ☆147 · Updated 4 years ago
- A large-scale study of knowledge distillation ☆217 · Updated 4 years ago
- Code for the NeurIPS 2019 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks" ☆196 · Updated 4 years ago
- TensorFlow implementation of Deep Mutual Learning ☆320 · Updated 6 years ago
- PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression ☆59 · Updated 7 years ago
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) ☆70 · Updated 5 years ago
- PyTorch implementation of Channel Distillation ☆100 · Updated 4 years ago
- MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning (ICCV 2019) ☆351 · Updated 4 years ago
- Implements quantized distillation; code for the paper "Model compression via distillation and quantization" ☆330 · Updated 3 months ago
- [AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers" ☆72 · Updated last year
- BlockDrop: Dynamic Inference Paths in Residual Networks ☆140 · Updated last year
- Code for SkipNet: Learning Dynamic Routing in Convolutional Networks (ECCV 2018) ☆236 · Updated 5 years ago
- Improving Consistency-Based Semi-Supervised Learning with Weight Averaging ☆185 · Updated 5 years ago
- [ICLR 2020] AtomNAS: Fine-Grained End-to-End Neural Architecture Search ☆223 · Updated 4 years ago
- Implementation of the mixup training method ☆465 · Updated 6 years ago
- Pruning neural networks with the Taylor criterion in PyTorch ☆314 · Updated 5 years ago
- Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks ☆376 · Updated 5 years ago
- PyTorch code for softmax variants: center loss, CosFace loss, large-margin Gaussian mixture, COCO loss, ring loss ☆252 · Updated 6 years ago
- Knowledge Distillation using TensorFlow ☆142 · Updated 5 years ago
- PyTorch implementation of Convolutional Networks with Adaptive Inference Graphs ☆185 · Updated 6 years ago
- PyTorch implementation of "When Does Label Smoothing Help?" ☆124 · Updated 4 years ago