SforAiDl / KD_Lib
A PyTorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
★ 608 · Updated last year
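For context, classic (Hinton-style) knowledge distillation, the baseline that libraries such as KD_Lib benchmark and extend, trains a student on a weighted sum of a temperature-softened KL term against the teacher's logits and the usual cross-entropy on hard labels. The snippet below is a minimal, generic PyTorch sketch of that objective, not KD_Lib's own API; the toy linear models, temperature `T`, and weight `alpha` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of soft-target KL divergence and hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients to keep the soft term comparable to the hard term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy models and data, purely for illustration.
teacher = torch.nn.Linear(784, 10)
student = torch.nn.Linear(784, 10)
optimizer = torch.optim.SGD(student.parameters(), lr=0.01)

x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))

with torch.no_grad():
    teacher_logits = teacher(x)  # teacher stays frozen during distillation

optimizer.zero_grad()
loss = distillation_loss(student(x), teacher_logits, y)
loss.backward()
optimizer.step()
```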
Related projects
Alternatives and complementary repositories for KD_Lib
- Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ★ 580 · Updated last year
- A coding-free framework built on PyTorch for reproducible deep learning studies. 25 knowledge distillation methods presented at CVPR, I… ★ 1,392 · Updated last month
- Pytorch implementation of various Knowledge Distillation (KD) methods. ★ 1,614 · Updated 2 years ago
- Toolbox to extend PyTorch functionalities ★ 417 · Updated 6 months ago
- A general and accurate MACs / FLOPs profiler for PyTorch models ★ 571 · Updated 6 months ago
- Open-source code for paper "Dataset Distillation" ★ 778 · Updated 2 years ago
- knowledge distillation papers ★ 741 · Updated last year
- NFNets and Adaptive Gradient Clipping for SGD implemented in PyTorch. Find explanation at tourdeml.github.io/blog/ ★ 345 · Updated 10 months ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ★ 1,864 · Updated last year
- AugMix: A Simple Data Processing Method to Improve Robustness and Uncertainty ★ 980 · Updated 3 months ago
- Unofficial PyTorch Reimplementation of RandAugment. ★ 628 · Updated last year
- Pytorch implementation of the paper "Class-Balanced Loss Based on Effective Number of Samples" ★ 785 · Updated 9 months ago
- Code for Noisy Student Training. https://arxiv.org/abs/1911.04252 ★ 753 · Updated 3 years ago
- PyTorch library to facilitate development and standardized evaluation of neural network pruning methods. ★ 424 · Updated last year
- A large scale study of Knowledge Distillation. ★ 217 · Updated 4 years ago
- This repository contains code for the paper "Decoupling Representation and Classifier for Long-Tailed Recognition", published at ICLR 202… ★ 946 · Updated 3 years ago
- mixup: Beyond Empirical Risk Minimization ★ 1,168 · Updated 3 years ago
- SAM: Sharpness-Aware Minimization (PyTorch) ★ 1,770 · Updated 9 months ago
- Compare neural networks by their feature similarity ★ 343 · Updated last year
- Learning Rate Warmup in PyTorch ★ 392 · Updated this week
- Estimate/count FLOPS for a given neural network using pytorch ★ 304 · Updated 2 years ago
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distill… ★ 807 · Updated last year
- MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop. ★ 693 · Updated 2 years ago
- Awesome Knowledge-Distillation. Categorized knowledge distillation papers (2014-2021). ★ 2,497 · Updated last year
- ★ 566 · Updated 3 weeks ago
- Awesome machine learning model compression research papers, quantization, tools, and learning material. ★ 491 · Updated 2 months ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ★ 2,197 · Updated last year
- A curated list of long-tailed recognition resources. ★ 583 · Updated last year
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ★ 414 · Updated 4 years ago
- Gradually-Warmup Learning Rate Scheduler for PyTorch (see the warmup sketch after this list) ★ 977 · Updated last month
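As referenced from the warmup scheduler entries above, linear learning-rate warmup simply ramps the learning rate from near zero up to its base value over the first few epochs before regular training (or a decay schedule) takes over. Below is a minimal sketch using plain `torch.optim.lr_scheduler.LambdaLR`; the base learning rate and warmup length are assumed values, and this is not the API of either warmup repository listed above.

```python
import torch

# Toy model and optimizer, purely for illustration.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # assumed base LR

warmup_epochs = 5  # assumed warmup length

# Scale the LR linearly from 1/warmup_epochs of the base value up to the
# full base value over the first `warmup_epochs` epochs, then hold it.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,
    lr_lambda=lambda epoch: min(1.0, (epoch + 1) / warmup_epochs),
)

for epoch in range(10):
    # ... one training epoch would run here ...
    optimizer.step()   # placeholder optimizer step for the sketch
    scheduler.step()
    print(epoch, scheduler.get_last_lr())
```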