Jin-Ying / Multi-Level-Logit-Distillation
Code for 'Multi-level Logit Distillation' (CVPR 2023)
☆66 · Updated 9 months ago
Alternatives and similar repositories for Multi-Level-Logit-Distillation
Users interested in Multi-Level-Logit-Distillation are comparing it to the repositories listed below.
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 · ☆147 · Updated 2 years ago
- CVPR 2023, Class Attention Transfer Based Knowledge Distillation · ☆44 · Updated 2 years ago
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" · ☆176 · Updated 7 months ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier" · ☆98 · Updated 3 years ago
- Official code for Scale Decoupled Distillation · ☆42 · Updated last year
- PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444 · ☆128 · Updated last year
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && PyTorch Implementations of… · ☆108 · Updated 2 years ago
- Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation, NeurIPS 2022 · ☆32 · Updated 2 years ago
- The official implementation of [NeurIPS 2024] Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation, https://ar… · ☆42 · Updated 7 months ago
- Official PyTorch implementation of PS-KD
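All of the repositories above build on logit-based knowledge distillation. As background for comparing them, here is a minimal sketch of the classic logit-distillation objective (Hinton-style KL divergence between temperature-softened teacher and student distributions) in PyTorch; this is an illustrative example, not the code of any repository listed here, and the tensor names are hypothetical.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Classic logit-distillation loss: KL divergence between
    temperature-softened teacher and student distributions,
    scaled by T^2 to keep gradient magnitudes comparable."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

# Hypothetical usage: a batch of 8 samples over 100 classes.
student = torch.randn(8, 100)
teacher = torch.randn(8, 100)
loss = kd_loss(student, teacher)  # non-negative scalar
```

Variants in the listed repositories change what is distilled (multi-level or class-attention statistics), how the temperature is set (e.g. curriculum schedules), or which divergence is used (e.g. Wasserstein distance instead of KL), but this loss is the common starting point.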