Jin-Ying / Multi-Level-Logit-Distillation
Code for 'Multi-level Logit Distillation' (CVPR2023)
☆60 · Updated 6 months ago
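For context on what these repositories have in common, here is a minimal sketch of plain temperature-scaled logit distillation (the KL-divergence baseline that multi-level and other logit-based KD methods build on). The function name and the temperature default are illustrative assumptions, not the API or settings of this repository.

```python
import torch
import torch.nn.functional as F

def logit_kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Vanilla logit distillation (Hinton-style): KL divergence between
    temperature-softened teacher and student distributions. The temperature
    value is an illustrative default, not taken from this repository."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```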
Alternatives and similar repositories for Multi-Level-Logit-Distillation:
Users interested in Multi-Level-Logit-Distillation are comparing it to the repositories listed below.
- Official code for Scale Decoupled Distillation ☆40 · Updated 11 months ago
- CVPR 2023, Class Attention Transfer Based Knowledge Distillation ☆41 · Updated last year
- [AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation" ☆167 · Updated 3 months ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 ☆141 · Updated 2 years ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆92 · Updated 2 years ago
- ☆25 · Updated last year
- Official PyTorch implementation of PS-KD ☆84 · Updated 2 years ago
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && PyTorch Implementations of… ☆104 · Updated 2 years ago
- The official implementation of [NeurIPS2024] Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation https://ar… ☆31 · Updated 3 months ago
- Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation. NeurIPS 2022. ☆32 · Updated 2 years ago
- Official implementation for the paper "Knowledge Diffusion for Distillation", NeurIPS 2023 ☆81 · Updated last year
- PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444 ☆117 · Updated 11 months ago
- Official code for Cumulative Spatial Knowledge Distillation for Vision Transformers (ICCV-2023) https://openaccess.thecvf.com/content/ICC… ☆15 · Updated last year
- PyTorch code and checkpoints release for VanillaKD: https://arxiv.org/abs/2305.15781 ☆75 · Updated last year
- [TPAMI-2023] Official implementations of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition ☆23 · Updated last year
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation). ☆76 · Updated last week
- Official PyTorch (MMCV) implementation of “Adversarial AutoMixup” (ICLR 2024 spotlight) ☆65 · Updated 4 months ago
- Code for ICML 2024 paper (Oral) — Test-Time Model Adaptation with Only Forward Passes ☆70 · Updated 7 months ago
- The code of the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR2023) ☆40 · Updated 2 years ago
- [ICCV 2023 & AAAI 2023] Binary Adapters & FacT, [Tech report] Convpass ☆179 · Updated last year
- The official implementation of ImbSAM (Imbalanced-SAM) ☆23 · Updated last year
- ☆85 · Updated last year
- [ICASSP-2021] Official implementations of Multi-View Contrastive Learning for Online Knowledge Distillation (MCL-OKD) ☆26 · Updated 3 years ago
- Switchable Online Knowledge Distillation ☆18 · Updated 5 months ago
- [ICCV 23] An approach to enhance the efficiency of Vision Transformer (ViT) by concurrently employing token pruning and token merging tech… ☆94 · Updated last year
- Code for CVPR 2024 paper - Resurrecting Old Classes with New Data for Exemplar-Free Continual Learning ☆24 · Updated 4 months ago
- Official code of "Generating Instance-level Prompts for Rehearsal-free Continual Learning (ICCV 2023)" ☆42 · Updated last year
- [IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation ☆71 · Updated 2 years ago
- [ICCV 2023 oral] This is the official repository for our paper: "Sensitivity-Aware Visual Parameter-Efficient Fine-Tuning". ☆66 · Updated last year
- Official implementation of the paper "Function-Consistent Feature Distillation" (ICLR 2023) ☆28 · Updated last year