karanchahal / distiller
A large scale study of Knowledge Distillation.
☆220 · Updated 5 years ago
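Since the repository itself is a study of knowledge distillation, here is a minimal sketch of the classic softened-logit distillation loss (Hinton et al., "Distilling the Knowledge in a Neural Network") in PyTorch, as context for the list below. The temperature `T` and mixing weight `alpha` are illustrative defaults, not values taken from this repository.

```python
# Minimal sketch of the standard softened-logit distillation loss
# (Hinton et al.). T and alpha are illustrative hyperparameters, not
# values taken from this repository.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Weighted sum of a KL term on temperature-softened logits and the
    usual cross-entropy on the hard labels."""
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    # The T**2 factor keeps gradient magnitudes comparable across temperatures.
    kd_term = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```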
Alternatives and similar repositories for distiller
Users interested in distiller are comparing it to the libraries listed below.
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆259 · Updated 5 years ago
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added) ☆265 · Updated 5 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks" ☆196 · Updated 5 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆105 · Updated 5 years ago
- Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for PyTorch; a minimal sketch of the update rule follows this list ☆337 · Updated 5 years ago
- Implementation and experiments for AdamW on PyTorch ☆94 · Updated 5 years ago
- Unofficial PyTorch implementation of Unsupervised Data Augmentation ☆147 · Updated 4 years ago
- Sparse learning library and sparse momentum resources ☆381 · Updated 3 years ago
- Code for reproducing Manifold Mixup results (ICML 2019) ☆493 · Updated last year
- PyTorch code for softmax variants: center loss, CosFace loss, large-margin Gaussian mixture, COCOLoss, ring loss ☆255 · Updated 7 years ago
- A specially designed light version of Fast AutoAugment ☆171 · Updated 5 years ago
- Implementations of ideas from recent papers ☆392 · Updated 4 years ago
- Example code showing how to use Nvidia DALI in PyTorch, with fallback to torchvision. Contains a few differences from the official Nvidia … ☆197 · Updated 5 years ago
- Knowledge distillation: CVPR 2020 oral, "Revisiting Knowledge Distillation via Label Smoothing Regularization" ☆584 · Updated 2 years ago
- Useful PyTorch functions and modules that are not implemented in PyTorch by default ☆188 · Updated last year
- Papers on deep neural network compression and acceleration ☆399 · Updated 4 years ago
- Mish deep learning activation function for PyTorch / FastAI ☆161 · Updated 5 years ago
- Learning Confidence for Out-of-Distribution Detection in Neural Networks ☆275 · Updated 7 years ago
- PyTorch implementation of the Lookahead optimizer ☆191 · Updated 3 years ago
- [ICLR 2020] NAS evaluation is frustratingly hard ☆149 · Updated last year
- A PyTorch knowledge distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization ☆634 · Updated 2 years ago
- 🛠 Toolbox to extend PyTorch functionalities ☆421 · Updated last year
- A re-implementation of Fixed-update Initialization ☆153 · Updated 6 years ago
- Code for the NeurIPS 2019 paper "L_DMI: An Information-theoretic Noise-robust Loss Function" ☆119 · Updated 2 years ago
- A PyTorch implementation of "EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks" ☆313 · Updated 5 years ago
- Official implementation of the ICML 2019 paper "Unsupervised Label Noise Modeling and Loss Correction" ☆223 · Updated 4 years ago
- homura is a library for fast prototyping of DL research ☆106 · Updated 3 years ago
- Code for https://arxiv.org/abs/1810.04622 ☆141 · Updated 5 years ago
- Zero-Shot Knowledge Distillation in Deep Networks ☆67 · Updated 3 years ago
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019 ☆402 · Updated 4 years ago
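Two of the entries above wrap the Lookahead optimizer ("k steps forward, 1 step back"). As a rough illustration of that update rule, here is a minimal sketch written as a wrapper around any `torch.optim` optimizer; `k=5` and `alpha=0.5` are the commonly quoted defaults, and none of this code is taken from the listed repositories.

```python
# Minimal, illustrative Lookahead wrapper (not code from the repos listed above).
# The fast weights take k inner-optimizer steps, then the slow weights move a
# fraction alpha toward them and the fast weights are reset to the slow ones.
import torch


class Lookahead:
    def __init__(self, optimizer, k=5, alpha=0.5):
        self.optimizer = optimizer      # any torch.optim optimizer ("fast" optimizer)
        self.k = k                      # fast steps per slow update
        self.alpha = alpha              # slow-weight step size
        self.step_count = 0
        # Snapshot of the slow weights, one tensor per parameter.
        self.slow_weights = [
            [p.detach().clone() for p in group["params"]]
            for group in optimizer.param_groups
        ]

    def zero_grad(self):
        self.optimizer.zero_grad()

    @torch.no_grad()
    def step(self):
        self.optimizer.step()           # one fast step
        self.step_count += 1
        if self.step_count % self.k != 0:
            return
        # Slow update: slow <- slow + alpha * (fast - slow); fast <- slow.
        for group, slow_group in zip(self.optimizer.param_groups, self.slow_weights):
            for p, slow in zip(group["params"], slow_group):
                slow.add_(p - slow, alpha=self.alpha)
                p.copy_(slow)


# Usage (illustrative):
#   base = torch.optim.SGD(model.parameters(), lr=0.1)
#   opt = Lookahead(base, k=5, alpha=0.5)
#   loss.backward(); opt.step(); opt.zero_grad()
```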