Jin-Ying / Multi-Level-Logit-Distillation
Code for "Multi-level Logit Distillation" (CVPR 2023)
☆67 · Updated 10 months ago
Alternatives and similar repositories for Multi-Level-Logit-Distillation
Users interested in Multi-Level-Logit-Distillation are comparing it to the repositories listed below.
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" ☆176 · Updated 8 months ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" (NeurIPS 2022) ☆148 · Updated 2 years ago
- Official code for Scale Decoupled Distillation ☆41 · Updated last year
- [CVPR 2022] Official implementation of "Knowledge Distillation with the Reused Teacher Classifier" ☆99 · Updated 3 years ago
- [CVPR 2023] Class Attention Transfer Based Knowledge Distillation ☆44 · Updated 2 years ago
- ☆26 · Updated last year
- Official implementation of [NeurIPS 2024] "Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation" https://ar… ☆42 · Updated 7 months ago
- Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation (NeurIPS 2022) ☆32 · Updated 2 years ago
- [ECCV 2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && PyTorch implementations of… ☆108 · Updated 2 years ago
- PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444 ☆129 · Updated last year
- Official PyTorch implementation of PS-KD
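The repositories above are all variants of logit-based knowledge distillation, whose common starting point is the classic objective from Hinton et al. (2015): soften teacher and student logits with a temperature, then minimize the KL divergence between the two distributions. A minimal pure-Python sketch of that baseline loss (function names are illustrative and not taken from any listed repo; the individual projects refine or replace this objective in different ways):

```python
import math

def softened_softmax(logits, temperature):
    # Temperature-scaled softmax: higher T flattens the distribution,
    # exposing "dark knowledge" in the teacher's non-target classes.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def vanilla_kd_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) on the softened distributions, scaled by T^2
    # to keep gradient magnitudes comparable across temperatures.
    p = softened_softmax(teacher_logits, temperature)
    q = softened_softmax(student_logits, temperature)
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q)
    )
```

Identical teacher and student logits give a loss of zero; any mismatch gives a positive loss, which is the signal the student is trained to minimize (usually combined with the ordinary cross-entropy on ground-truth labels).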