karanchahal / distiller
A large scale study of Knowledge Distillation.
☆220 · Updated 5 years ago
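For context on the topic these repositories share: knowledge distillation trains a small student network to match a large teacher's temperature-softened output distribution in addition to the hard labels. Below is a minimal sketch of the classic distillation loss from Hinton et al. (2015); the temperature `T` and mixing weight `alpha` are illustrative defaults, not values taken from the distiller repo itself.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic knowledge distillation loss (Hinton et al., 2015).

    Blends KL divergence against the teacher's temperature-scaled
    distribution with the usual hard-label cross-entropy.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps gradient magnitudes comparable, per the paper
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```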
Alternatives and similar repositories for distiller
Users interested in distiller are comparing it to the libraries listed below.
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆259 · Updated 5 years ago
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added). ☆265 · Updated 5 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆105 · Updated 5 years ago
- Unofficial PyTorch Implementation of Unsupervised Data Augmentation. ☆147 · Updated 4 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks" ☆196 · Updated 5 years ago
- Implementation of and experiments with the AdamW optimizer in PyTorch (see the sketch after this list).
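On the last entry: AdamW (Loshchilov & Hutter) decouples weight decay from the adaptive gradient update instead of folding it into the gradient as an L2 penalty, which is what plain Adam does. The linked repo's own implementation may differ; this is only a minimal sketch of the interface using the `torch.optim.AdamW` that later shipped in PyTorch core.

```python
import torch
import torch.nn.functional as F

model = torch.nn.Linear(10, 2)

# AdamW subtracts lr * weight_decay * w directly from the weights each step,
# rather than adding weight_decay * w to the gradient as L2-regularized Adam does.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)

x, y = torch.randn(4, 10), torch.randint(0, 2, (4,))
optimizer.zero_grad()
loss = F.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
```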