thaonguyen19 / ModelDistillation-PyTorch
PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression
☆59 · Updated 6 years ago
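The repository implements Hinton et al.'s knowledge distillation, in which a small student network is trained to match the temperature-softened output distribution of a larger teacher. A minimal sketch of that loss in PyTorch is below; the function name `distillation_loss` and the defaults `T=4.0`, `alpha=0.9` are illustrative choices, not taken from this repository.

```python
# Minimal sketch of the Hinton et al. knowledge-distillation loss.
# Assumes the teacher and student both output raw (unnormalized) logits.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend a softened KL term (teacher -> student) with hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitudes match the hard loss, as in the paper
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage on random logits for a 10-class problem.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```

In practice the teacher's logits are computed under `torch.no_grad()` so that only the student receives gradients.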
Related projects:
- Official implementation of MEAL: Multi-Model Ensemble via Adversarial Learning (AAAI 2019) ☆177 · Updated 4 years ago
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) ☆70 · Updated 5 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆104 · Updated 5 years ago
- Tools for computing model parameters and FLOPs ☆86 · Updated 5 years ago
- Code for "Discrimination-aware Channel Pruning for Deep Neural Networks" ☆183 · Updated 3 years ago
- PyTorch implementation for GAL ☆55 · Updated 4 years ago
- PyTorch implementation of "Learning Efficient Convolutional Networks through Network Slimming" ☆76 · Updated 5 years ago
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆254 · Updated 4 years ago
- Trains a student network using knowledge obtained from a larger, pre-trained teacher network ☆154 · Updated 6 years ago
- [CVPR 2020] MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning ☆92 · Updated last year
- Exploring CNNs and model quantization on the Caltech-256 dataset ☆84 · Updated 6 years ago
- Global Sparse Momentum SGD for pruning very deep neural networks ☆43 · Updated 2 years ago
- [AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers" ☆72 · Updated last year
- TensorFlow implementation of Deep Mutual Learning ☆318 · Updated 6 years ago
- Channel pruning of a ResNet18 base model ☆143 · Updated last year
- Towards Optimal Structured CNN Pruning via Generative Adversarial Learning ☆15 · Updated 5 years ago
- Learning Metrics from Teachers: Compact Networks for Image Embedding (CVPR 2019) ☆76 · Updated 5 years ago
- An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100 ☆156 · Updated 3 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks" ☆194 · Updated 4 years ago
- PyTorch implementation of SNAS ☆75 · Updated 5 years ago
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision ☆233 · Updated last year
- Code for Centripetal SGD ☆62 · Updated 2 years ago
- Learning both Weights and Connections for Efficient Neural Networks https://arxiv.org/abs/1506.02626 ☆174 · Updated last year
- DELTA: DEep Learning Transfer using Feature Map with Attention for Convolutional Networks https://arxiv.org/abs/1901.09229 ☆66 · Updated 3 years ago
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods; more will be added) ☆265 · Updated 4 years ago
- FitNets: Hints for Thin Deep Nets ☆202 · Updated 9 years ago