haitongli / knowledge-distillation-pytorch
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
☆1,981 · Updated Mar 25, 2023
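The repository centers on the classic teacher-student setup: a student network is trained to match the temperature-softened output distribution of a larger teacher alongside the usual supervised objective. Below is a minimal PyTorch sketch of that standard distillation loss (Hinton et al., 2015); the function name and the `alpha`/`temperature` hyperparameters are illustrative assumptions, not necessarily this repository's exact API.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, alpha=0.9, temperature=4.0):
    """Blend of soft (teacher) and hard (label) targets; names are illustrative."""
    T = temperature
    # KL divergence between temperature-softened distributions, scaled by T^2
    # so gradient magnitude stays roughly independent of the temperature.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```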
Alternatives and similar repositories for knowledge-distillation-pytorch
Users interested in knowledge-distillation-pytorch are comparing it to the libraries listed below.
- Awesome Knowledge Distillation ☆3,811 · Updated Dec 25, 2025
- Pytorch implementation of various Knowledge Distillation (KD) methods. ☆1,741 · Updated Nov 25, 2021
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆585 · Updated Feb 15, 2023
- Awesome Knowledge-Distillation. Knowledge distillation papers organized by category (2014-2021). ☆2,654 · Updated May 30, 2023
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ☆2,426 · Updated Oct 16, 2023
- A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quan… ☆653 · Updated Mar 1, 2023
- Improving Convolutional Networks via Attention Transfer (ICLR 2017) ☆1,463 · Updated Jul 11, 2018
- Knowledge distillation papers ☆767 · Updated Feb 10, 2023
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆26 knowledge distillation methods p… ☆1,590 · Updated Dec 24, 2025
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆264 · Updated Oct 3, 2019
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019 ☆414 · Updated May 17, 2021
- Rethinking the Value of Network Pruning (PyTorch) (ICLR 2019) ☆1,516 · Updated Jun 7, 2020
- The official code for the paper "Structured Knowledge Distillation for Semantic Segmentation" (CVPR 2019 Oral) and extension to other ta… ☆740 · Updated Apr 20, 2020
- Implements quantized distillation. Code for the paper "Model compression via distillation and quantization". ☆336 · Updated Jul 25, 2024
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ☆36,351 · Updated this week
- Count the MACs / FLOPs of your PyTorch model. ☆5,081 · Updated Jul 8, 2024
- MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop. ☆701 · Updated Dec 24, 2021
- Implementation of CVPR 2019 paper: Distilling Object Detectors with Fine-grained Feature Imitation ☆420 · Updated Jul 15, 2021
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆423 · Updated Jun 23, 2020
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆106 · Updated Sep 9, 2019
- A PyTorch implementation of EfficientNet ☆8,220 · Updated Apr 8, 2022
- Channel Pruning for Accelerating Very Deep Neural Networks (ICCV 2017) ☆1,086 · Updated May 2, 2024
- Pretrained ConvNets for PyTorch: NASNet, ResNeXt, ResNet, InceptionV4, InceptionResnetV2, Xception, DPN, etc. ☆9,116 · Updated Apr 22, 2022
- [ICLR 2020] Once for All: Train One Network and Specialize it for Efficient Deployment ☆1,939 · Updated Dec 14, 2023
- Automated deep learning algorithms implemented in PyTorch. ☆1,584 · Updated Apr 24, 2022
- PyTorch implementation of MoCo: https://arxiv.org/abs/1911.05722 ☆5,115 · Updated Feb 3, 2026
- A PyTorch Extension: tools for easy mixed precision and distributed training in PyTorch ☆8,910 · Updated Jan 26, 2026
- micronet, a model compression and deployment library. Compression: 1. quantization: quantization-aware training (QAT), High-Bit (>2b) (DoReFa/Quantiz… ☆2,271 · Updated May 6, 2025
- Codebase for Image Classification Research, written in PyTorch. ☆2,169 · Updated Mar 20, 2024
- Implementation of model compression with the knowledge distillation method. ☆342 · Updated Jan 3, 2017
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added) ☆265 · Updated Nov 21, 2019
- ResNeSt: Split-Attention Networks ☆3,265 · Updated Dec 9, 2022
- FitNets: Hints for Thin Deep Nets ☆211 · Updated May 14, 2015
- PyTorch DataLoaders implemented with DALI for accelerating image preprocessing ☆884 · Updated Jul 4, 2020
- Network Slimming (PyTorch) (ICCV 2017) ☆917 · Updated Nov 6, 2020
- Slimmable Networks, AutoSlim, and Beyond (ICLR 2019, ICCV 2019) ☆928 · Updated Mar 9, 2023
- A state-of-the-art semi-supervised method for image recognition ☆1,654 · Updated Oct 8, 2020
- 95.47% on CIFAR10 with PyTorch ☆6,358 · Updated Feb 24, 2023
- Official DeiT repository ☆4,323 · Updated Mar 15, 2024