megvii-research / mdistiller
The official implementation of [CVPR 2022] Decoupled Knowledge Distillation (https://arxiv.org/abs/2203.08679) and [ICCV 2023] DOT: A Distillation-Oriented Trainer (https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf)
☆889 · Updated Nov 5, 2023
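The headline paper, Decoupled Knowledge Distillation, splits the classical KD loss into a target-class term (TCKD) and a non-target-class term (NCKD) that are weighted independently. A minimal NumPy sketch of that decomposition follows; the function names, the `alpha`/`beta`/`T` defaults, and the `-1e9` masking trick are illustrative choices, not code taken from the repository:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def kl_batchmean(p, q, eps=1e-8):
    # KL(p || q), summed over classes and averaged over the batch.
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))) / p.shape[0])

def dkd_loss(logits_s, logits_t, target, alpha=1.0, beta=8.0, T=4.0):
    """Decoupled KD sketch: alpha * TCKD + beta * NCKD."""
    n, c = logits_s.shape
    mask = np.zeros((n, c), dtype=bool)
    mask[np.arange(n), target] = True  # one-hot mask of ground-truth classes

    ps, pt = softmax(logits_s / T), softmax(logits_t / T)
    # TCKD: binary (target vs. all non-target) probability pair per sample.
    bs = np.stack([ps[mask], 1.0 - ps[mask]], axis=1)
    bt = np.stack([pt[mask], 1.0 - pt[mask]], axis=1)
    tckd = kl_batchmean(bt, bs) * T * T

    # NCKD: distribution over non-target classes only
    # (the target logit is suppressed before the softmax).
    ns = softmax(np.where(mask, -1e9, logits_s / T))
    nt = softmax(np.where(mask, -1e9, logits_t / T))
    nckd = kl_batchmean(nt, ns) * T * T

    return alpha * tckd + beta * nckd
```

When the student and teacher logits are identical, both KL terms vanish and the loss is zero; the `beta` weight lets the non-target term be emphasized independently of the target term, which is the core argument of the paper.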
Alternatives and similar repositories for mdistiller
Users interested in mdistiller are comparing it to the libraries listed below.
- Distilling Knowledge via Knowledge Review, CVPR 2021 · ☆280 · Updated Dec 16, 2022
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" · ☆182 · Updated Dec 3, 2024
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods · ☆2,426 · Updated Oct 16, 2023
- Official implementation of "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 · ☆155 · Updated Dec 28, 2022
- Masked Generative Distillation (ECCV 2022) · ☆240 · Updated Nov 9, 2022
- Focal and Global Knowledge Distillation for Detectors (CVPR 2022) · ☆385 · Updated Sep 19, 2022
- Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023) · ☆388 · Updated Oct 24, 2024
- Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014–2021) · ☆2,654 · Updated May 30, 2023
- Class Attention Transfer Based Knowledge Distillation, CVPR 2023 · ☆46 · Updated Jun 13, 2023
- PyTorch implementation of various Knowledge Distillation (KD) methods · ☆1,742 · Updated Nov 25, 2021
- [CVPR 2022] Official implementation of "Knowledge Distillation with the Reused Teacher Classifier" · ☆102 · Updated Jun 16, 2022
- "NKD and USKD" (ICCV 2023) and "ViTKD" (CVPRW 2024) · ☆242 · Updated Oct 10, 2023
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆26 knowledge distillation methods p… · ☆1,590 · Updated Dec 24, 2025
- [CVPR 2024 Highlight] Logit Standardization in Knowledge Distillation · ☆391 · Updated Oct 9, 2024
- [ECCV 2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && Pytorch Implementations of… · ☆110 · Updated Nov 28, 2022
- PyTorch code and checkpoints release for VanillaKD: https://arxiv.org/abs/2305.15781 · ☆76 · Updated Nov 21, 2023
- OpenMMLab Model Compression Toolbox and Benchmark · ☆1,661 · Updated Jun 11, 2024
- Official code for Scale Decoupled Distillation · ☆77 · Updated Apr 1, 2024
- Knowledge distillation papers · ☆767 · Updated Feb 10, 2023
- Official implementations of CIRKD: Cross-Image Relational Knowledge Distillation for Semantic Segmentation and implementations on Citysca… · ☆211 · Updated Aug 29, 2025
- Official code for the ECCV 2022 paper "A Fast Knowledge Distillation Framework for Visual Recognition" · ☆190 · Updated Apr 29, 2024
- Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching", AAAI 2021 · ☆119 · Updated Feb 9, 2021
- PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444 · ☆135 · Updated Apr 19, 2024
- ☆49 · Updated Sep 24, 2022
- Awesome Knowledge Distillation · ☆3,811 · Updated Dec 25, 2025
- [AAAI 2021, TKDE 2023] Official implementation of "Cross-Layer Distillation with Semantic Calibration" · ☆78 · Updated Jul 29, 2024
- CrossKD: Cross-Head Knowledge Distillation for Dense Object Detection · ☆196 · Updated Sep 24, 2023
- Revisiting Knowledge Distillation via Label Smoothing Regularization, CVPR 2020 Oral · ☆585 · Updated Feb 15, 2023
- Official implementation of "Masked Distillation with Receptive Tokens", ICLR 2023 · ☆71 · Updated Apr 14, 2023
- ☆114 · Updated Apr 21, 2021
- Official MegEngine implementation of the ECCV 2022 paper "Efficient One Pass Self-distillation with Zipf's Label Smoothin…" · ☆28 · Updated Oct 19, 2022
- A PyTorch knowledge distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quan… · ☆653 · Updated Mar 1, 2023
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (the paper of NORM is published in IC… · ☆20 · Updated Sep 18, 2023
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) · ☆423 · Updated Jun 23, 2020
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019 · ☆414 · Updated May 17, 2021
- TF-FD · ☆20 · Updated Nov 19, 2022
- Official PyTorch implementation of PS-KD · ☆89 · Updated Aug 5, 2022
- [CVPR 2023] DepGraph: Towards Any Structural Pruning; LLMs, Vision Foundation Models, etc. · ☆3,255 · Updated Sep 7, 2025
- Distilling Object Detectors with Feature Richness · ☆43 · Updated Apr 15, 2022