Distilling Knowledge via Knowledge Review, CVPR 2021
☆278 · Dec 16, 2022 · Updated 3 years ago
Alternatives and similar repositories for ReviewKD
Users interested in ReviewKD are comparing it to the repositories listed below.
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distill… · ☆895 · Nov 5, 2023 · Updated 2 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) · ☆423 · Jun 23, 2020 · Updated 5 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". · ☆78 · Jul 29, 2024 · Updated last year
- Distilling Object Detectors with Feature Richness · ☆43 · Apr 15, 2022 · Updated 3 years ago
- Official implementation for "Knowledge Distillation with Refined Logits". · ☆22 · Aug 26, 2024 · Updated last year
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods · ☆2,426 · Oct 16, 2023 · Updated 2 years ago
- Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014–2021). · ☆2,655 · May 30, 2023 · Updated 2 years ago
- Masked Generative Distillation (ECCV 2022) · ☆241 · Nov 9, 2022 · Updated 3 years ago
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆26 knowledge distillation methods p… · ☆1,601 · Dec 24, 2025 · Updated 2 months ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 · ☆155 · Dec 28, 2022 · Updated 3 years ago
- ☆114 · Apr 21, 2021 · Updated 4 years ago
- CVPR 2023, Class Attention Transfer Based Knowledge Distillation · ☆46 · Jun 13, 2023 · Updated 2 years ago
- Official implementation for "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021) · ☆103 · Apr 30, 2024 · Updated last year
- Focal and Global Knowledge Distillation for Detectors (CVPR 2022) · ☆385 · Sep 19, 2022 · Updated 3 years ago
- PyTorch implementation of various Knowledge Distillation (KD) methods. · ☆1,745 · Nov 25, 2021 · Updated 4 years ago
- ResRep: Lossless CNN Pruning via Decoupling Remembering and Forgetting (ICCV 2021) · ☆301 · Dec 1, 2022 · Updated 3 years ago
- Official implementation of the paper "Function-Consistent Feature Distillation" (ICLR 2023) · ☆30 · Jul 5, 2023 · Updated 2 years ago
- IEEE Transactions on Intelligent Transportation Systems (2024) · ☆24 · Jul 22, 2025 · Updated 8 months ago
- ☆27 · Jun 28, 2022 · Updated 3 years ago
- ☆24 · May 6, 2022 · Updated 3 years ago
- Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023) · ☆388 · Oct 24, 2024 · Updated last year
- Official implementation for "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021) · ☆119 · Feb 9, 2021 · Updated 5 years ago
- Source code for the BMVC-2021 paper "SimReg: Regression as a Simple Yet Effective Tool for Self-supervised Knowledge Distillation". · ☆16 · Jan 20, 2022 · Updated 4 years ago
- ☆47 · Sep 9, 2021 · Updated 4 years ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". · ☆103 · Jun 16, 2022 · Updated 3 years ago
- The official (TMLR) implementation of LumiNet: Perception-Driven Knowledge Distillation via Statistical Logit Calibration · ☆17 · Aug 17, 2025 · Updated 7 months ago
- Official implementations of CIRKD: Cross-Image Relational Knowledge Distillation for Semantic Segmentation and implementations on Citysca… · ☆212 · Aug 29, 2025 · Updated 6 months ago
- PyTorch code and checkpoints release for VanillaKD: https://arxiv.org/abs/2305.15781 · ☆76 · Nov 21, 2023 · Updated 2 years ago
- OpenMMLab Model Compression Toolbox and Benchmark. · ☆1,664 · Jun 11, 2024 · Updated last year
- ☆10 · Nov 2, 2023 · Updated 2 years ago
- [arXiv 2024] PyTorch implementation of RRD: https://arxiv.org/abs/2407.12073 · ☆15 · Dec 2, 2025 · Updated 3 months ago
- Official implementation of the paper "Masked Distillation with Receptive Tokens", ICLR 2023. · ☆71 · Apr 14, 2023 · Updated 2 years ago
- The official code for the paper "Structured Knowledge Distillation for Semantic Segmentation" (CVPR 2019 oral) and extension to other ta… · ☆743 · Apr 20, 2020 · Updated 5 years ago
- Official implementation of the detection self-distillation framework LGD. · ☆53 · Apr 19, 2022 · Updated 3 years ago
- [ICLR'23] Trainability Preserving Neural Pruning (PyTorch) · ☆34 · May 21, 2023 · Updated 2 years ago
- ☆196 · Aug 27, 2021 · Updated 4 years ago
- The official implementation of the ICLR 2021 paper "Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and E… · ☆63 · Jun 16, 2021 · Updated 4 years ago
- Awesome Knowledge Distillation · ☆3,827 · Dec 25, 2025 · Updated 2 months ago
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision · ☆237 · Dec 15, 2022 · Updated 3 years ago
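Most of the repositories above extend the classic logit-distillation objective: a student network is trained to match the temperature-softened output distribution of a teacher while also fitting the ground-truth labels. A minimal PyTorch sketch of that shared baseline (generic KD, not the specific method of ReviewKD or any single repo listed; the function name `kd_loss` and the `T`/`alpha` defaults are illustrative assumptions):

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic knowledge-distillation loss (illustrative; names/defaults assumed).

    Combines a soft term (KL divergence between temperature-softened
    teacher and student distributions) with the usual hard-label
    cross-entropy, weighted by alpha.
    """
    # Soften both distributions with temperature T; the T*T factor keeps
    # gradient magnitudes comparable across temperatures (Hinton et al.).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard supervised term on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

Feature-distillation methods in the list (e.g. overhaul- or review-style approaches) replace or augment the soft term with losses on intermediate feature maps, but the training loop shape stays the same.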