thaonguyen19 / ModelDistillation-PyTorch
PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression
☆58 · Updated 7 years ago
Alternatives and similar repositories for ModelDistillation-PyTorch:
Users interested in ModelDistillation-PyTorch are comparing it to the libraries listed below:
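The repository above implements the distillation loss from "Distilling the Knowledge in a Neural Network" (Hinton et al., 2015): a temperature-softened KL term against the teacher's outputs plus an ordinary cross-entropy term on the hard labels. A minimal, dependency-free sketch of that loss; the function names and the `temperature`/`alpha` defaults here are illustrative, not taken from the repo:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label,
                      temperature=4.0, alpha=0.9):
    # Soft-target term: KL(teacher || student) at temperature T,
    # scaled by T^2 so its gradient magnitude matches the hard term.
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    soft = sum(pt * (math.log(pt) - math.log(ps))
               for pt, ps in zip(p_t, p_s))
    # Hard-target term: standard cross-entropy at T = 1.
    hard = -math.log(softmax(student_logits)[hard_label])
    return alpha * (temperature ** 2) * soft + (1 - alpha) * hard
```

When the student's logits match the teacher's, the KL term vanishes and only the weighted cross-entropy remains, which is a quick sanity check for an implementation like this.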
- Official implementation of "MEAL: Multi-Model Ensemble via Adversarial Learning" (AAAI 2019)☆177 · Updated 5 years ago
- Code for "Discrimination-aware Channel Pruning for Deep Neural Networks"☆184 · Updated 4 years ago
- Learning both Weights and Connections for Efficient Neural Networks https://arxiv.org/abs/1506.02626☆177 · Updated 2 years ago
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf☆257 · Updated 5 years ago
- FitNets: Hints for Thin Deep Nets☆206 · Updated 9 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)☆104 · Updated 5 years ago
- PyTorch implementation of "Learning Efficient Convolutional Networks through Network Slimming"☆77 · Updated 6 years ago
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)☆71 · Updated 5 years ago
- Channel pruning applied to a base ResNet18 model☆145 · Updated 2 years ago
- PyTorch implementation of weight pruning☆185 · Updated 7 years ago
- PyTorch implementation for GAL☆56 · Updated 4 years ago
- Code for Centripetal SGD☆63 · Updated 2 years ago
- ☆135 · Updated 6 years ago
- MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning (ICCV 2019)☆354 · Updated 4 years ago
- A PyTorch implementation of MobileNetV2 on CIFAR-10☆62 · Updated 2 years ago
- Teaches a student network using knowledge obtained by training a larger teacher network☆159 · Updated 7 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks"☆196 · Updated 5 years ago
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers"☆74 · Updated last year
- [ECCV2020] Knowledge Distillation Meets Self-Supervision☆236 · Updated 2 years ago
- TensorFlow implementation of Deep Mutual Learning☆322 · Updated 7 years ago
- Exploring CNNs and model quantization on the Caltech-256 dataset☆85 · Updated 7 years ago
- ☆38 · Updated 6 years ago
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods; more will be added)☆264 · Updated 5 years ago
- ☆22 · Updated 2 years ago
- ☆166 · Updated 2 years ago
- BlockDrop: Dynamic Inference Paths in Residual Networks☆142 · Updated 2 years ago
- An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100☆167 · Updated 4 years ago
- PyTorch implementation of "Pruning Filters For Efficient ConvNets"☆150 · Updated last year
- Implementation of "Data-free Knowledge Distillation for Deep Neural Networks" (on arXiv)☆81 · Updated 7 years ago
- Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks☆381 · Updated 5 years ago