karanchahal / distiller
A large-scale study of Knowledge Distillation.
☆219 · Updated 4 years ago
Alternatives and similar repositories for distiller:
Users interested in distiller are comparing it to the libraries listed below.
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added). ☆264 · Updated 5 years ago
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆257 · Updated 5 years ago
- Papers for deep neural network compression and acceleration ☆397 · Updated 3 years ago
- Unofficial PyTorch implementation of Unsupervised Data Augmentation. ☆146 · Updated 4 years ago
- Implementations of ideas from recent papers ☆392 · Updated 4 years ago
- A PyTorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quan… ☆619 · Updated 2 years ago
- Knowledge Distillation: CVPR 2020 Oral, "Revisiting Knowledge Distillation via Label Smoothing Regularization" ☆586 · Updated 2 years ago
- Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for PyTorch ☆334 · Updated 5 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks" ☆196 · Updated 5 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆104 · Updated 5 years ago
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019 ☆395 · Updated 3 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆417 · Updated 4 years ago
- Implements quantized distillation. Code for our paper "Model compression via distillation and quantization" ☆331 · Updated 8 months ago
- Pruning neural networks with the Taylor criterion in PyTorch ☆315 · Updated 5 years ago
- ConvNet training using PyTorch ☆345 · Updated 4 years ago
- PyTorch library to facilitate development and standardized evaluation of neural network pruning methods. ☆428 · Updated last year
- Zero-Shot Knowledge Distillation in Deep Networks ☆65 · Updated 2 years ago
- 🛠 Toolbox to extend PyTorch functionalities ☆418 · Updated 10 months ago
- Example code showing how to use Nvidia DALI in PyTorch, with fallback to torchvision. Contains a few differences to the official Nvidia … ☆197 · Updated 5 years ago
- Open-source code for the paper "Dataset Distillation" ☆792 · Updated 2 years ago
- PyTorch layer-by-layer model profiler ☆606 · Updated 3 years ago
- Code for reproducing Manifold Mixup results (ICML 2019) ☆487 · Updated 11 months ago
- Code for MentorNet: Learning Data-Driven Curriculum for Very Deep Neural Networks ☆324 · Updated 2 years ago
- Virtual Adversarial Training (VAT) implementation for PyTorch ☆296 · Updated 6 years ago
- Code release for the paper "Random Search and Reproducibility for NAS" ☆167 · Updated 5 years ago
- ☆174 · Updated 8 months ago
- Implementation of the mixup training method ☆466 · Updated 6 years ago
- A PyTorch implementation of "EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks" ☆311 · Updated 5 years ago
- A curated list of resources about few-shot and one-shot learning ☆282 · Updated 5 years ago
- MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks (NeurIPS 2020 workshop) ☆698 · Updated 3 years ago
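Many of the repositories above benchmark variants of the same core objective: Hinton-style knowledge distillation, where a student model is trained to match the teacher's temperature-softened output distribution. A minimal, dependency-free sketch of that loss is below (function names and the temperature value are illustrative, not taken from any repository listed):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T gives a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) over temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T
    # (as in Hinton et al., "Distilling the Knowledge in a Neural Network").
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    return T * T * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

In practice this term is combined with the ordinary cross-entropy on the hard labels, weighted by a mixing coefficient; the loss is zero exactly when the student's softened distribution matches the teacher's.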