thaonguyen19 / ModelDistillation-PyTorch
PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression
☆59 · Updated 7 years ago
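The paper this repo implements, "Distilling the Knowledge in a Neural Network" (Hinton et al., 2015), trains a small student on a weighted mix of (a) the KL divergence to the teacher's temperature-softened outputs and (b) the usual cross-entropy with the hard label. A minimal sketch of that loss in plain Python (this repo itself uses PyTorch; the function and parameter names here are illustrative, not taken from its code):

```python
# Sketch of the Hinton et al. soft-target distillation loss,
# written with only the standard library for clarity.
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T gives a softer distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      T=4.0, alpha=0.9):
    """alpha-weighted sum of the soft-target term (teacher vs. student at
    temperature T) and the hard-label cross-entropy term (at T = 1)."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student); the T^2 factor from the paper keeps soft-target
    # gradient magnitudes comparable across temperatures.
    kl = sum(p * math.log(p / q) for p, q in zip(p_teacher, p_student))
    soft_loss = (T ** 2) * kl
    hard_loss = -math.log(softmax(student_logits)[true_label])
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the down-weighted hard-label term remains, which is one way to sanity-check an implementation.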
Related projects
Alternatives and complementary repositories for ModelDistillation-PyTorch
- Official implementation of "MEAL: Multi-Model Ensemble via Adversarial Learning" (AAAI 2019) ☆177 · Updated 4 years ago
- PyTorch implementation of "Learning Efficient Convolutional Networks through Network Slimming" ☆77 · Updated 5 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆104 · Updated 5 years ago
- Learning both Weights and Connections for Efficient Neural Networks https://arxiv.org/abs/1506.02626 ☆174 · Updated 2 years ago
- Tools for computing model parameters and FLOPs. ☆86 · Updated 5 years ago
- ☆38 · Updated 6 years ago
- Channel pruning of a base ResNet18 model ☆143 · Updated 2 years ago
- PyTorch implementation for GAL. ☆55 · Updated 4 years ago
- Teaches a student network using knowledge obtained by training a larger teacher network ☆156 · Updated 6 years ago
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆258 · Updated 5 years ago
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) ☆70 · Updated 5 years ago
- PyTorch implementation of the Network-in-Network model on CIFAR-10 ☆51 · Updated 5 years ago
- [CVPR 2020] MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning ☆92 · Updated last year
- MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning (ICCV 2019) ☆351 · Updated 4 years ago
- ☆25 · Updated 5 years ago
- PyTorch implementation of Channel Distillation ☆100 · Updated 4 years ago
- PyTorch implementation of weight pruning ☆184 · Updated 6 years ago
- Code for "Discrimination-aware Channel Pruning for Deep Neural Networks" ☆184 · Updated 4 years ago
- An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100 ☆161 · Updated 4 years ago
- Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks ☆376 · Updated 5 years ago
- ☆165 · Updated last year
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks" ☆196 · Updated 4 years ago
- PyTorch implementation of "Pruning Filters for Efficient ConvNets" ☆148 · Updated last year
- Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers https://arxiv.org/abs/1802.00124 ☆71 · Updated 6 years ago
- ☆49 · Updated 5 years ago
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added) ☆266 · Updated 5 years ago
- Towards Optimal Structured CNN Pruning via Generative Adversarial Learning ☆15 · Updated 5 years ago
- Exploring CNNs and model quantization on the Caltech-256 dataset ☆84 · Updated 7 years ago
- Implementation of "Data-free Knowledge Distillation for Deep Neural Networks" (on arXiv) ☆79 · Updated 6 years ago
- Code for Centripetal SGD ☆62 · Updated 2 years ago