megvii-research / mdistiller
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf
☆853 · Updated last year
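For context on what mdistiller implements: DKD rewrites the classical KL-based KD loss as a weighted sum of a target-class term (TCKD) and a non-target-class term (NCKD). The sketch below is a minimal, self-contained PyTorch rendering of that decomposition based on the paper (arXiv:2203.08679); the function name `dkd_loss` and the hyperparameter defaults are illustrative placeholders, not the repository's exact API.

```python
import torch
import torch.nn.functional as F


def dkd_loss(logits_student, logits_teacher, target, alpha=1.0, beta=8.0, temperature=4.0):
    """Illustrative DKD loss: alpha * TCKD + beta * NCKD (defaults are placeholders)."""
    num_classes = logits_student.size(1)
    gt_mask = F.one_hot(target, num_classes).bool()  # (N, C), True at the ground-truth class

    # TCKD: binary KL between the (target, non-target) probability masses.
    p_s = F.softmax(logits_student / temperature, dim=1)
    p_t = F.softmax(logits_teacher / temperature, dim=1)
    p_s_bin = torch.stack([(p_s * gt_mask).sum(1), (p_s * ~gt_mask).sum(1)], dim=1)
    p_t_bin = torch.stack([(p_t * gt_mask).sum(1), (p_t * ~gt_mask).sum(1)], dim=1)
    tckd = F.kl_div(p_s_bin.log(), p_t_bin, reduction="batchmean") * temperature ** 2

    # NCKD: KL over the non-target classes only; the target logit is suppressed
    # with a large negative offset before the softmax.
    log_p_s_nt = F.log_softmax(logits_student / temperature - 1000.0 * gt_mask, dim=1)
    p_t_nt = F.softmax(logits_teacher / temperature - 1000.0 * gt_mask, dim=1)
    nckd = F.kl_div(log_p_s_nt, p_t_nt, reduction="batchmean") * temperature ** 2

    return alpha * tckd + beta * nckd


# Usage sketch: logits from student/teacher networks and integer class labels.
# loss = dkd_loss(student(x), teacher(x).detach(), y)
```

Decoupling the two terms lets β up-weight NCKD, which in the classical KD formulation is suppressed whenever the teacher is confident on the target class.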
Alternatives and similar repositories for mdistiller:
Users interested in mdistiller are comparing it to the libraries listed below.
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆270 · Updated 2 years ago
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆1,686 · Updated 3 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ☆2,325 · Updated last year
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆586 · Updated 2 years ago
- 'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024) ☆228 · Updated last year
- OpenMMLab Model Compression Toolbox and Benchmark. ☆1,586 · Updated 10 months ago
- Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021). ☆2,580 · Updated last year
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆25 knowledge distillation methods p… ☆1,499 · Updated this week
- [NeurIPS 2021] [T-PAMI] DynamicViT: Efficient Vision Transformers with Dynamic Token Sparsification ☆602 · Updated last year
- CAIRI Supervised, Semi- and Self-Supervised Visual Representation Learning Toolbox and Benchmark ☆650 · Updated 3 weeks ago
- Focal and Global Knowledge Distillation for Detectors (CVPR 2022) ☆369 · Updated 2 years ago
- Masked Generative Distillation (ECCV 2022) ☆221 · Updated 2 years ago
- [AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation" ☆173 · Updated 5 months ago
- This is a collection of our NAS and Vision Transformer work. ☆1,751 · Updated 9 months ago
- This is a collection of our zero-cost NAS and efficient vision applications. ☆417 · Updated last year
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 ☆146 · Updated 2 years ago
- PyTorch implementation of MoCo v3 https://arxiv.org/abs/2104.02057 ☆1,264 · Updated 3 years ago
- knowledge distillation papers ☆753 · Updated 2 years ago
- A codebase and a curated list of awesome deep long-tailed learning (TPAMI 2023). ☆553 · Updated 5 months ago
- assistant tools for attention visualization in deep learning ☆1,151 · Updated 2 years ago
- ☆125 · Updated 4 years ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ☆1,932 · Updated 2 years ago
- Efficient computing methods developed by Huawei Noah's Ark Lab ☆1,265 · Updated 6 months ago
- Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition" ☆187 · Updated last year
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆417 · Updated 4 years ago
- A PyTorch implementation of the paper 'Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation', … ☆176 · Updated 3 years ago
- ☆427 · Updated 3 years ago
- United Perception ☆432 · Updated 2 years ago
- PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444 ☆122 · Updated last year
- Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023) ☆381 · Updated 6 months ago