megvii-research / mdistiller
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf
☆885 · Updated 2 years ago
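The headline method here, Decoupled Knowledge Distillation (DKD), splits the classical KD loss into a target-class term (TCKD) and a non-target-class term (NCKD) that are weighted independently. As background only (this is a minimal NumPy sketch of the loss as described in the paper, not the repo's actual code; the `alpha`/`beta` defaults below are illustrative):

```python
import numpy as np

def _softmax(x, T):
    """Temperature-scaled softmax along the class axis."""
    z = x / T - (x / T).max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def dkd_loss(logits_s, logits_t, target, alpha=1.0, beta=8.0, T=4.0):
    """Sketch of DKD: loss = alpha * TCKD + beta * NCKD."""
    n = logits_s.shape[0]
    idx = np.arange(n)
    p_s, p_t = _softmax(logits_s, T), _softmax(logits_t, T)

    # TCKD: KL between binary (target vs. all non-target) distributions
    b_s = np.stack([p_s[idx, target], 1.0 - p_s[idx, target]], axis=1)
    b_t = np.stack([p_t[idx, target], 1.0 - p_t[idx, target]], axis=1)
    tckd = (b_t * np.log(b_t / b_s)).sum(axis=1).mean() * T ** 2

    # NCKD: KL between distributions renormalized over non-target classes,
    # obtained by masking the target logit out of the softmax
    masked_s, masked_t = logits_s.copy(), logits_t.copy()
    masked_s[idx, target] = -1e9
    masked_t[idx, target] = -1e9
    q_s, q_t = _softmax(masked_s, T), _softmax(masked_t, T)
    nckd = (q_t * np.log((q_t + 1e-12) / (q_s + 1e-12))).sum(axis=1).mean() * T ** 2

    return alpha * tckd + beta * nckd
```

The point of the decoupling is that classical KD implicitly ties NCKD's weight to the teacher's target-class confidence; DKD exposes `alpha` and `beta` as free hyperparameters instead.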
Alternatives and similar repositories for mdistiller
Users interested in mdistiller are comparing it to the repositories listed below.
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆278 · Updated 3 years ago
- Pytorch implementation of various Knowledge Distillation (KD) methods. ☆1,738 · Updated 4 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ☆2,421 · Updated 2 years ago
- 'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024) ☆241 · Updated 2 years ago
- OpenMMLab Model Compression Toolbox and Benchmark. ☆1,655 · Updated last year
- Efficient computing methods developed by Huawei Noah's Ark Lab ☆1,307 · Updated last year
- CAIRI Supervised, Semi- and Self-Supervised Visual Representation Learning Toolbox and Benchmark ☆658 · Updated 2 months ago
- Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆585 · Updated 2 years ago
- This is a collection of our zero-cost NAS and efficient vision applications. ☆448 · Updated 2 years ago
- Awesome Knowledge-Distillation. Knowledge distillation papers organized by category (2014–2021). ☆2,647 · Updated 2 years ago
- [AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation" ☆181 · Updated last year
- Focal and Global Knowledge Distillation for Detectors (CVPR 2022) ☆383 · Updated 3 years ago
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆26 knowledge distillation methods p… ☆1,584 · Updated 2 weeks ago
- [NeurIPS 2021] [T-PAMI] DynamicViT: Efficient Vision Transformers with Dynamic Token Sparsification ☆643 · Updated 2 years ago
- knowledge distillation papers ☆766 · Updated 2 years ago
- Official implementation of paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 ☆154 · Updated 3 years ago
- ☆430 · Updated 3 years ago
- Masked Generative Distillation (ECCV 2022) ☆237 · Updated 3 years ago
- This is a collection of our NAS and Vision Transformer work. ☆1,820 · Updated last year
- ☆128 · Updated 5 years ago
- SimpleAICV: pytorch training and testing examples. ☆437 · Updated 2 months ago
- PyTorch implementations of various lightweight networks, such as MobileNetV2, MobileNeXt, GhostNet, ParNet, MobileViT, AdderNet, Sh… ☆906 · Updated 3 years ago
- Pytorch reproduction of Peer Collaborative Learning for Online Knowledge Distillation, AAAI2021 ☆21 · Updated 3 years ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ☆1,977 · Updated 2 years ago
- The official PyTorch implementation of CHEX: CHannel EXploration for CNN Model Compression (CVPR 2022). Paper is available at https://ope… ☆38 · Updated 3 years ago
- Scaling Up Your Kernels to 31x31: Revisiting Large Kernel Design in CNNs (CVPR 2022) ☆936 · Updated last year
- PyTorch implementation of Dynamic Convolution: Attention over Convolution Kernels (CVPR 2020) ☆594 · Updated 3 years ago
- PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444 ☆135 · Updated last year
- Official implementation for (Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching, AAAI-2021) ☆118 · Updated 4 years ago
- This is the official PyTorch implementation of the paper "TransFG: A Transformer Architecture for Fine-grained Recognition" (Ju He, Jie-N… ☆414 · Updated 3 years ago