karanchahal / distiller
A large-scale study of Knowledge Distillation.
☆217 · Updated 4 years ago
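For context, knowledge distillation trains a compact student network to match a larger teacher's temperature-softened output distribution in addition to the hard labels. Below is a minimal sketch of the standard soft-target loss (Hinton et al., 2015) in PyTorch; the function name, temperature, and weighting are illustrative assumptions, not code taken from this repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Soft-target distillation loss (Hinton et al., 2015).

    Blends the KL divergence between temperature-softened teacher and
    student distributions with ordinary cross-entropy on the hard labels.
    `temperature` and `alpha` are illustrative defaults, not values from
    the distiller repository.
    """
    soft_teacher = F.log_softmax(teacher_logits / temperature, dim=1)
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    # Scale the KL term by T^2 so its gradients stay comparable to the
    # hard-label term as the temperature changes.
    kd_term = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean", log_target=True) * temperature ** 2
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term

# Example usage (teacher outputs are detached so only the student is trained):
# loss = distillation_loss(student(x), teacher(x).detach(), y)
```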
Related projects
Alternatives and complementary repositories for distiller
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added). ☆267 · Updated 4 years ago
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆256 · Updated 5 years ago
- Implementations of ideas from recent papers ☆391 · Updated 3 years ago
- Unofficial PyTorch Implementation of Unsupervised Data Augmentation. ☆147 · Updated 4 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks" ☆195 · Updated 4 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆104 · Updated 5 years ago
- Papers for deep neural network compression and acceleration ☆394 · Updated 3 years ago
- A PyTorch implementation of "EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks." ☆309 · Updated 4 years ago
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization