Jin-Ying / Multi-Level-Logit-Distillation
Code for 'Multi-level Logit Distillation' (CVPR 2023)
☆68 · Updated 11 months ago
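For reference, the paper aligns teacher and student predictions at the instance, batch, and class levels, each computed over several temperatures ("prediction augmentation"). Below is a minimal, unofficial sketch of that objective in PyTorch; the function name, temperature set, and equal weighting of the three levels are illustrative assumptions, not the repository's actual code.

```python
import torch
import torch.nn.functional as F

def mlkd_loss_sketch(logits_s, logits_t, temperatures=(2.0, 3.0, 4.0)):
    """Unofficial sketch of multi-level logit distillation (assumed form).

    For each temperature, teacher and student predictions are aligned at:
      instance level: KL divergence between softened distributions,
      batch level:    MSE between sample-similarity (Gram) matrices,
      class level:    MSE between class-correlation matrices.
    """
    total = 0.0
    for T in temperatures:
        log_p_s = F.log_softmax(logits_s / T, dim=1)   # (B, C) student log-probs
        p_s = log_p_s.exp()
        p_t = F.softmax(logits_t / T, dim=1)           # (B, C) teacher probs

        # Instance-level alignment, scaled by T^2 to keep gradients comparable.
        inst = F.kl_div(log_p_s, p_t, reduction="batchmean") * T * T

        # Batch-level alignment: how similar each pair of samples looks.
        batch = F.mse_loss(p_s @ p_s.t(), p_t @ p_t.t())

        # Class-level alignment: how classes co-activate across the batch.
        cls = F.mse_loss(p_s.t() @ p_s, p_t.t() @ p_t)

        total = total + inst + batch + cls
    return total / len(temperatures)
```

In training this would be added to the ordinary cross-entropy task loss, with the teacher logits detached from the graph.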
Alternatives and similar repositories for Multi-Level-Logit-Distillation
Users interested in Multi-Level-Logit-Distillation are comparing it to the repositories listed below; a sketch of the classic logit-distillation objective these works build on follows the list.
- [CVPR 2023] Class Attention Transfer Based Knowledge Distillation ☆45 · Updated 2 years ago
- [AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation" ☆178 · Updated 8 months ago
- [CVPR 2022] Official implementation of "Knowledge Distillation with the Reused Teacher Classifier" ☆100 · Updated 3 years ago
- [NeurIPS 2022] Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" ☆148 · Updated 2 years ago
- Official code for Scale Decoupled Distillation ☆41 · Updated last year
- ☆26 · Updated last year
- PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444 ☆130 · Updated last year
- [NeurIPS 2022] Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation ☆32 · Updated 2 years ago
- Official PyTorch implementation of PS-KD ☆89 · Updated 3 years ago
- [NeurIPS 2024] The official implementation of "Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation" https://ar… ☆42 · Updated 8 months ago
- [ECCV 2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && PyTorch Implementations of…
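Most of the repositories above extend the classic temperature-scaled logit distillation of Hinton et al. (2015). As a common point of reference, here is a minimal sketch of that shared baseline objective; the weighting `alpha` and temperature `T` are illustrative defaults, not values taken from any listed repository.

```python
import torch.nn.functional as F

def kd_loss(logits_s, logits_t, labels, T=4.0, alpha=0.9):
    """Classic logit distillation: softened KL term plus hard-label CE."""
    soft = F.kl_div(
        F.log_softmax(logits_s / T, dim=1),   # student log-probs at temperature T
        F.softmax(logits_t / T, dim=1),       # teacher probs at temperature T
        reduction="batchmean",
    ) * T * T                                 # T^2 restores the gradient magnitude
    hard = F.cross_entropy(logits_s, labels)  # ordinary hard-label task loss
    return alpha * soft + (1.0 - alpha) * hard
```

The listed works vary this recipe along different axes: the temperature schedule (Curriculum Temperature), the divergence used (the Wasserstein-distance paper), the levels at which logits are matched (Multi-Level Logit Distillation, Scale Decoupled Distillation), or the teacher components reused (Reused Teacher Classifier).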