megvii-research / mdistiller
The official implementation of Decoupled Knowledge Distillation [CVPR 2022] (https://arxiv.org/abs/2203.08679) and DOT: A Distillation-Oriented Trainer [ICCV 2023] (https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf)
☆807 · Updated last year
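For orientation, here is a minimal sketch of the DKD loss this repo implements: the paper decouples the classic KD objective into a target-class term (TCKD) and a non-target-class term (NCKD), combined as alpha·TCKD + beta·NCKD. This is an illustrative reconstruction from the paper, not the repo's exact code; the function name, the alpha/beta/temperature defaults, and the masking details below are assumptions.

```python
# Hedged sketch of Decoupled Knowledge Distillation (DKD), CVPR 2022.
# Illustrative only -- see the official mdistiller repo for the reference code.
import torch
import torch.nn.functional as F

def dkd_loss(logits_student, logits_teacher, target, alpha=1.0, beta=8.0, T=4.0):
    # alpha/beta/T defaults are assumptions; the paper tunes beta per dataset.
    # One-hot mask for the target class, and its complement for the rest.
    gt_mask = torch.zeros_like(logits_student).scatter_(1, target.unsqueeze(1), 1.0)
    other_mask = 1.0 - gt_mask

    p_s = F.softmax(logits_student / T, dim=1)
    p_t = F.softmax(logits_teacher / T, dim=1)

    # TCKD: KL divergence between binary (target vs. all non-target) distributions.
    p_s_bin = torch.stack([(p_s * gt_mask).sum(1), (p_s * other_mask).sum(1)], dim=1)
    p_t_bin = torch.stack([(p_t * gt_mask).sum(1), (p_t * other_mask).sum(1)], dim=1)
    tckd = F.kl_div(torch.log(p_s_bin), p_t_bin, reduction="batchmean") * (T ** 2)

    # NCKD: KL divergence among non-target classes only; the target logit is
    # pushed toward -inf with a large negative offset so its probability vanishes.
    log_p_s_nt = F.log_softmax(logits_student / T - 1000.0 * gt_mask, dim=1)
    p_t_nt = F.softmax(logits_teacher / T - 1000.0 * gt_mask, dim=1)
    nckd = F.kl_div(log_p_s_nt, p_t_nt, reduction="batchmean") * (T ** 2)

    return alpha * tckd + beta * nckd
```

In training, a decoupled term like this is typically added to the standard cross-entropy loss on the student's logits; weighting NCKD separately (beta) is the paper's key change from vanilla KD.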
Related projects
Alternatives and complementary repositories for mdistiller
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆261 · Updated last year
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆1,614 · Updated 2 years ago
- 'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024) ☆217 · Updated last year
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ☆2,197 · Updated last year
- CAIRI Supervised, Semi- and Self-Supervised Visual Representation Learning Toolbox and Benchmark ☆629 · Updated 3 weeks ago
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆580 · Updated last year
- OpenMMLab Model Compression Toolbox and Benchmark. ☆1,479 · Updated 5 months ago
- Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014-2021). ☆2,497 · Updated last year
- [NeurIPS 2021] [T-PAMI] DynamicViT: Efficient Vision Transformers with Dynamic Token Sparsification ☆572 · Updated last year
- Masked Generative Distillation (ECCV 2022) ☆214 · Updated 2 years ago
- [AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation" ☆158 · Updated 4 months ago
- A codebase and a curated list of awesome deep long-tailed learning (TPAMI 2023). ☆502 · Updated last year
- Focal and Global Knowledge Distillation for Detectors (CVPR 2022) ☆351 · Updated 2 years ago
- PyTorch code and checkpoint release for OFA-KD: https://arxiv.org/abs/2310.19444 ☆97 · Updated 7 months ago
- A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, I… ☆1,392 · Updated last month
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" (NeurIPS 2022) ☆138 · Updated last year
- ❄️🔥 Visual Prompt Tuning [ECCV 2022] https://arxiv.org/abs/2203.12119 ☆1,043 · Updated last year
- Knowledge distillation papers ☆741 · Updated last year
- PyTorch implementation of "All Tokens Matter: Token Labeling for Training Better Vision Transformers" ☆425 · Updated last year
- Scaling Up Your Kernels to 31x31: Revisiting Large Kernel Design in CNNs (CVPR 2022) ☆870 · Updated 6 months ago
- Efficient computing methods developed by Huawei Noah's Ark Lab ☆1,203 · Updated 2 weeks ago
- United Perception ☆430 · Updated last year
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ☆1,864 · Updated last year
- Assistant tools for attention visualization in deep learning ☆1,009 · Updated 2 years ago
- Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021) ☆115 · Updated 3 years ago
- This is a collection of our NAS and Vision Transformer work. ☆1,689 · Updated 3 months ago
- Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition" ☆181 · Updated 6 months ago
- SimpleAICV: PyTorch training and testing examples. ☆419 · Updated this week