SforAiDl / KD_Lib
A PyTorch knowledge distillation library for benchmarking and extending work in the domains of knowledge distillation, pruning, and quantization.
☆ 634 · Updated 2 years ago
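For context, the core technique this library packages is the classic softened-logits distillation loss. Below is a minimal sketch of vanilla knowledge distillation (Hinton et al., 2015) in plain PyTorch; it is illustrative only and does not use KD_Lib's actual API. The function name and hyperparameter values are assumptions.

```python
# Minimal sketch of vanilla knowledge distillation in plain PyTorch.
# NOT KD_Lib's API; names and hyperparameters here are illustrative.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend a soft loss against the teacher with a hard cross-entropy loss."""
    # Soft targets: KL divergence between temperature-scaled distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage with random tensors standing in for model outputs.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```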
Alternatives and similar repositories for KD_Lib
Users interested in KD_Lib are comparing it to the libraries listed below.
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 26 knowledge distillation methods p… ☆ 1,526 · Updated 2 weeks ago
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆ 584 · Updated 2 years ago
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆ 1,706 · Updated 3 years ago
- A large-scale study of Knowledge Distillation. ☆ 220 · Updated 5 years ago
- PyTorch library to facilitate development and standardized evaluation of neural network pruning methods. ☆ 430 · Updated 2 years ago
- NFNets and Adaptive Gradient Clipping for SGD implemented in PyTorch. Find explanation at tourdeml.github.io/blog/ ☆ 347 · Updated last year
- Toolbox to extend PyTorch functionalities ☆ 421 · Updated last year
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ☆ 1,954 · Updated 2 years ago
- Code for Noisy Student Training. https://arxiv.org/abs/1911.04252 ☆ 762 · Updated 4 years ago
- Escaping the Big Data Paradigm with Compact Transformers, 2021 (Train your Vision Transformers in 30 mins on CIFAR-10 with a single GPU!) ☆ 534 · Updated 8 months ago
- ☆ 602 · Updated 3 weeks ago
- Open-source code for the paper "Dataset Distillation" ☆ 810 · Updated 3 weeks ago
- Estimate/count FLOPs for a given neural network using PyTorch ☆ 305 · Updated 3 years ago
- AugMix: A Simple Data Processing Method to Improve Robustness and Uncertainty ☆ 990 · Updated last month
- Awesome machine learning model compression research papers, quantization, tools, and learning material. ☆ 526 · Updated 9 months ago
- A general and accurate MACs / FLOPs profiler for PyTorch models ☆ 620 · Updated last year
- Knowledge distillation papers ☆ 757 · Updated 2 years ago
- Unofficial PyTorch implementation of "Meta Pseudo Labels" ☆ 387 · Updated last year
- Compare neural networks by their feature similarity ☆ 365 · Updated 2 years ago
- MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop. ☆ 696 · Updated 3 years ago
- Learning Rate Warmup in PyTorch (see the warmup sketch after this list) ☆ 411 · Updated 3 weeks ago
- Official PyTorch implementation of the "ImageNet-21K Pretraining for the Masses" (NeurIPS 2021) paper ☆ 768 · Updated 2 years ago
- Is the attention layer even necessary? (https://arxiv.org/abs/2105.02723) ☆ 486 · Updated 4 years ago
- Collection of recent methods on (deep) neural network compression and acceleration. ☆ 948 · Updated 3 months ago
- Unofficial PyTorch reimplementation of RandAugment. ☆ 637 · Updated 2 years ago
- Official PyTorch implementation of "TResNet: High-Performance GPU-Dedicated Architecture" (WACV 2021) ☆ 475 · Updated 7 months ago
- PyTorch implementation of the paper "Class-Balanced Loss Based on Effective Number of Samples" (see the weighting sketch after this list) ☆ 800 · Updated last year
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆ 418 · Updated 5 years ago
- Summary and code for Deep Neural Network Quantization ☆ 549 · Updated last month
- (ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?" ☆ 818 · Updated 3 years ago
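As referenced in the "Learning Rate Warmup in PyTorch" entry above, warmup is small enough to sketch with stock PyTorch. The snippet below is a minimal sketch using torch.optim.lr_scheduler.LambdaLR, not that repository's API; warmup_steps, the model, and the training loop are illustrative assumptions.

```python
# Minimal sketch of linear learning-rate warmup with stock PyTorch.
# NOT the linked repository's API; warmup_steps and the loop are illustrative.
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

warmup_steps = 100  # hypothetical: ramp the LR over the first 100 steps

def warmup_factor(step):
    # Scale the base LR linearly from ~0 up to 1 during warmup, then hold at 1.
    return min(1.0, (step + 1) / warmup_steps)

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=warmup_factor)

for step in range(200):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()
    scheduler.step()  # call after optimizer.step() in recent PyTorch
```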
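Likewise, the "Class-Balanced Loss" entry above boils down to one formula: the effective number of samples E_n = (1 - beta^n) / (1 - beta), with per-class weights proportional to 1 / E_n. The sketch below implements that weighting in plain PyTorch, not the linked repository's code; the class counts and beta value are illustrative assumptions.

```python
# Minimal sketch of class-balanced weighting (Cui et al., 2019).
# NOT the linked repository's code; counts and beta are illustrative.
import torch
import torch.nn.functional as F

samples_per_class = torch.tensor([5000.0, 500.0, 50.0])  # hypothetical counts
beta = 0.999

# Effective number of samples per class: (1 - beta^n) / (1 - beta).
effective_num = (1.0 - torch.pow(beta, samples_per_class)) / (1.0 - beta)
weights = 1.0 / effective_num
# Normalize so the weights sum to the number of classes, as in the paper.
weights = weights / weights.sum() * len(samples_per_class)

# Use the weights directly in a standard weighted cross-entropy loss.
logits = torch.randn(8, 3)
labels = torch.randint(0, 3, (8,))
loss = F.cross_entropy(logits, labels, weight=weights)
```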