karanchahal / distiller
A large-scale study of Knowledge Distillation.
☆220 · Updated 5 years ago
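Since distiller and many of the entries below center on knowledge distillation, a minimal sketch of the classic soft-target distillation loss (temperature-scaled KL divergence blended with hard-label cross-entropy) may help orient readers. This is an illustration, not code from this repo; the temperature `T=4.0` and weight `alpha=0.9` are assumed values.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic KD loss: soften both logit sets with temperature T,
    match them with KL divergence, and blend with hard-label CE."""
    soft_student = F.log_softmax(student_logits / T, dim=1)
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    # T^2 rescales gradients so the soft term stays comparable as T grows
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * T * T
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```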
Alternatives and similar repositories for distiller
Users interested in distiller are comparing it to the libraries listed below.
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆263 · Updated 6 years ago
- Knowledge distillation methods implemented with TensorFlow (there are now 11 (+1) methods, and more will be added) ☆265 · Updated 6 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆106 · Updated 6 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks" ☆196 · Updated 5 years ago
- Unofficial PyTorch Implementation of Unsupervised Data Augmentation ☆148 · Updated 5 years ago
- A specially designed light version of Fast AutoAugment ☆171 · Updated 5 years ago
- Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for PyTorch; a minimal sketch of the update rule follows this list ☆338 · Updated 6 years ago
- PyTorch implementation of the Lookahead Optimizer ☆195 · Updated 3 years ago
- Implementation and experiments for AdamW in PyTorch ☆94 · Updated 6 years ago
- A PyTorch implementation of "EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks" ☆315 · Updated 6 years ago
- A machine learning experiment ☆180 · Updated 8 years ago
- Implementations of ideas from recent papers ☆392 · Updated 5 years ago
- Code for recent knowledge distillation algorithms and benchmark results via the TF2.0 low-level API ☆112 · Updated 3 years ago
- Code release for the paper "Random Search and Reproducibility for NAS" ☆167 · Updated 6 years ago
- Learning Confidence for Out-of-Distribution Detection in Neural Networks ☆275 · Updated 7 years ago
- Example code showing how to use Nvidia DALI in PyTorch, with fallback to torchvision. Contains a few differences to the official Nvidia … ☆197 · Updated 5 years ago
- Papers for deep neural network compression and acceleration ☆403 · Updated 4 years ago
- Sparse learning library and sparse momentum resources ☆384 · Updated 3 years ago
- homura is a library for fast prototyping of DL research ☆106 · Updated 3 years ago
- PyTorch implementation of AutoAugment ☆161 · Updated 5 years ago
- Knowledge Distillation using TensorFlow ☆141 · Updated 6 years ago
- Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 Oral) ☆585 · Updated 2 years ago
- A PyTorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization ☆653 · Updated 2 years ago
- [ICLR 2020] NAS evaluation is frustratingly hard ☆149 · Updated 2 years ago
- Code for MentorNet: Learning Data-Driven Curriculum for Very Deep Neural Networks ☆325 · Updated 2 years ago
- Zero-Shot Knowledge Distillation in Deep Networks ☆67 · Updated 3 years ago
- Code for reproducing Manifold Mixup results (ICML 2019) ☆495 · Updated last year
- Mish Deep Learning Activation Function for PyTorch / FastAI; its one-line definition appears in the sketch after this list ☆161 · Updated 5 years ago
- Unofficial PyTorch Implementation of EvoNorm ☆123 · Updated 4 years ago
- Repository to track the progress in model compression and acceleration ☆106 · Updated 4 years ago
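For the Lookahead entries above ("k steps forward, 1 step back"), a minimal sketch of one Lookahead cycle, assuming a generic PyTorch training setup; the function name, inner SGD optimizer, `k=5`, and `alpha=0.5` are illustrative choices, not taken from either listed repo.

```python
import torch

def lookahead_cycle(model, slow_weights, inner_opt, loss_fn, batches, k=5, alpha=0.5):
    """One Lookahead cycle: k fast steps with the inner optimizer,
    then pull the slow weights toward the fast weights by factor alpha."""
    for inputs, targets in batches[:k]:           # k steps "forward"
        inner_opt.zero_grad()
        loss_fn(model(inputs), targets).backward()
        inner_opt.step()
    with torch.no_grad():                         # 1 step "back"
        for p, slow in zip(model.parameters(), slow_weights):
            slow += alpha * (p - slow)            # slow <- slow + alpha*(fast - slow)
            p.copy_(slow)                         # restart fast weights at slow weights
```

Here `slow_weights` would be initialized once as `[p.detach().clone() for p in model.parameters()]`; the repos above package this same pattern as a reusable optimizer.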
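The Mish activation named above has a simple closed form, Mish(x) = x · tanh(softplus(x)); a one-line PyTorch version for reference:

```python
import torch
import torch.nn.functional as F

def mish(x: torch.Tensor) -> torch.Tensor:
    # Mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))
    return x * torch.tanh(F.softplus(x))
```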