SforAiDl / KD_Lib
A PyTorch knowledge distillation library for benchmarking and extending work in knowledge distillation, pruning, and quantization.
☆653 · Mar 1, 2023 · Updated 2 years ago
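For readers new to the area: the core technique that KD_Lib and most of the repositories below implement is the soft-target distillation loss of Hinton et al. (2015). The sketch below is a generic PyTorch illustration of that loss, not KD_Lib's own API; the temperature `T` and weight `alpha` are illustrative values.

```python
# Generic sketch of the classic soft-target distillation loss
# (Hinton et al., 2015) -- for illustration only, not KD_Lib's API.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft term: KL divergence between temperature-softened distributions.
    # Multiplying by T*T keeps gradient magnitudes comparable to the hard term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

Most of the distillation repositories below vary this recipe: what is matched (logits, features, attention maps) and how the soft and hard terms are weighted.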
Alternatives and similar repositories for KD_Lib
Users interested in KD_Lib are comparing it to the libraries listed below.
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆1,741 · Nov 25, 2021 · Updated 4 years ago
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆26 knowledge distillation methods p… ☆1,590 · Dec 24, 2025 · Updated last month
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility. ☆1,981 · Mar 25, 2023 · Updated 2 years ago
- Awesome Knowledge-Distillation. Categorized knowledge distillation papers (2014-2021). ☆2,654 · May 30, 2023 · Updated 2 years ago
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization. ☆585 · Feb 15, 2023 · Updated 2 years ago
- Awesome Knowledge Distillation. ☆3,811 · Dec 25, 2025 · Updated last month
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods. ☆2,426 · Oct 16, 2023 · Updated 2 years ago
- Knowledge distillation papers. ☆767 · Feb 10, 2023 · Updated 3 years ago
- The official implementation of [CVPR 2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV 2023] DOT: A Distill… ☆889 · Nov 5, 2023 · Updated 2 years ago
- OpenMMLab Model Compression Toolbox and Benchmark. ☆1,662 · Jun 11, 2024 · Updated last year
- A large-scale study of Knowledge Distillation. ☆220 · Apr 19, 2020 · Updated 5 years ago
- MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop. ☆701 · Dec 24, 2021 · Updated 4 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019). ☆423 · Jun 23, 2020 · Updated 5 years ago
- A curated list of neural network pruning resources. ☆2,490 · Apr 4, 2024 · Updated last year
- PyTorch implementation for Channel Distillation. ☆103 · Jun 9, 2020 · Updated 5 years ago
- Official implementation for "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021). ☆119 · Feb 9, 2021 · Updated 5 years ago
- ☆34 · Aug 20, 2023 · Updated 2 years ago
- Mobile vision models and code. ☆916 · Updated this week
- [ICLR 2020] Once for All: Train One Network and Specialize It for Efficient Deployment. ☆1,939 · Dec 14, 2023 · Updated 2 years ago
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ☆36,351 · Updated this week
- torch-optimizer -- a collection of optimizers for PyTorch. ☆3,161 · Mar 22, 2024 · Updated last year
- [CVPR 2023] DepGraph: Towards Any Structural Pruning; LLMs, Vision Foundation Models, etc. ☆3,255 · Sep 7, 2025 · Updated 5 months ago
- micronet, a model compression and deployment library. Compression: 1. quantization: quantization-aware training (QAT), high-bit (>2b) (DoReFa/Quantiz… ☆2,271 · May 6, 2025 · Updated 9 months ago
- Revisiting Parameter Sharing for Automatic Neural Channel Number Search, NeurIPS 2020. ☆22 · Nov 15, 2020 · Updated 5 years ago
- (ECCV 2020 Oral) EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning. ☆309 · Dec 8, 2022 · Updated 3 years ago
- PyTorch implementation of "When Does Label Smoothing Help?". ☆126 · Jan 13, 2020 · Updated 6 years ago
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (The paper of NORM is published in IC… ☆20 · Sep 18, 2023 · Updated 2 years ago
- Awesome Knowledge-Distillation for CV. ☆92 · Apr 30, 2024 · Updated last year
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision. ☆238 · Dec 15, 2022 · Updated 3 years ago
- Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression. ☆19 · Oct 12, 2021 · Updated 4 years ago
- Distilling Knowledge via Knowledge Review, CVPR 2021. ☆280 · Dec 16, 2022 · Updated 3 years ago
- Efficient computing methods developed by Huawei Noah's Ark Lab. ☆1,309 · Nov 5, 2024 · Updated last year
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019). ☆106 · Sep 9, 2019 · Updated 6 years ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" (NeurIPS 2022). ☆155 · Dec 28, 2022 · Updated 3 years ago
- Summary and code for deep neural network quantization. ☆558 · Jun 14, 2025 · Updated 7 months ago
- ☆58 · Jun 18, 2021 · Updated 4 years ago
- Code for our paper at ECCV 2020: "Post-Training Piecewise Linear Quantization for Deep Neural Networks". ☆68 · Nov 4, 2021 · Updated 4 years ago
- Post-training sparsity-aware quantization. ☆34 · Feb 26, 2023 · Updated 2 years ago
- ☆27 · Feb 6, 2021 · Updated 5 years ago
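Beyond distillation, several entries above (the pruning resource list, DepGraph, EagleEye) target pruning. As a point of reference, a minimal unstructured magnitude-pruning pass can be done with PyTorch's built-in `torch.nn.utils.prune`; the model and the 30% sparsity level below are placeholders for illustration.

```python
# Minimal sketch of unstructured L1-magnitude pruning with the stock
# torch.nn.utils.prune utilities; model and sparsity are placeholders.
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

for module in model.modules():
    if isinstance(module, nn.Linear):
        # Zero the 30% of weights with the smallest absolute value.
        prune.l1_unstructured(module, name="weight", amount=0.3)
        # Fold the pruning mask into the weight tensor permanently.
        prune.remove(module, "weight")
```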
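Likewise, the quantization-focused entries (micronet, the quantization summary, the post-training quantization papers) address a problem for which stock PyTorch ships a baseline: post-training dynamic quantization. The sketch below uses that built-in API on a placeholder two-layer model.

```python
# Minimal sketch of post-training dynamic quantization with PyTorch's
# built-in quantize_dynamic; the two-layer model is a placeholder.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
model.eval()

# Replace each nn.Linear with a dynamically quantized int8 equivalent:
# weights are stored in int8, activations are quantized on the fly.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 784)
print(quantized(x).shape)  # torch.Size([1, 10])
```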