zhengli97 / CTKD
[AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation"
☆163 · Updated 2 months ago
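For readers comparing these repositories, the core idea behind CTKD is to make the distillation temperature learnable, train it adversarially against the KD objective, and ramp the adversarial strength with a curriculum over training. The sketch below is illustrative only, not the repository's code; `GradReverse`, `LearnableTemperature`, and `cosine_lambda` are hypothetical names.

```python
# Illustrative sketch only: a learnable distillation temperature trained
# adversarially (via gradient reversal) against the KD loss, with a cosine
# curriculum on the adversarial weight. Names are hypothetical, not taken
# from the CTKD repository.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; scales gradients by -lam in the backward
    pass, so the temperature is updated to increase the KD loss while the
    student is updated to decrease it."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


class LearnableTemperature(nn.Module):
    """A single global temperature, kept above 1 via softplus."""

    def __init__(self, init=1.0):
        super().__init__()
        self.raw = nn.Parameter(torch.tensor(init))

    def forward(self, lam):
        tau = F.softplus(self.raw) + 1.0
        return GradReverse.apply(tau, lam)


def kd_loss(student_logits, teacher_logits, tau):
    """Standard temperature-scaled KL distillation loss."""
    p_teacher = F.softmax(teacher_logits / tau, dim=1)
    log_p_student = F.log_softmax(student_logits / tau, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * tau ** 2


def cosine_lambda(epoch, total_epochs):
    """Curriculum: ramp the adversarial weight from 0 to 1 over training."""
    return 0.5 * (1.0 - math.cos(math.pi * min(epoch / total_epochs, 1.0)))


# Usage inside a training step (student/teacher logits assumed given):
#   lam  = cosine_lambda(epoch, num_epochs)
#   tau  = temp_module(lam)          # temp_module = LearnableTemperature()
#   loss = ce_loss + alpha * kd_loss(s_logits, t_logits.detach(), tau)
```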
Alternatives and similar repositories for CTKD:
Users interested in CTKD are comparing it to the repositories listed below
- 'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024) ☆224 · Updated last year
- Code for 'Multi-level Logit Distillation' (CVPR 2023) ☆58 · Updated 4 months ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier" ☆92 · Updated 2 years ago
- Official implementation of paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 ☆140 · Updated 2 years ago
- PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444 ☆110 · Updated 9 months ago
- ☆85 · Updated last year
- CVPR 2023, Class Attention Transfer Based Knowledge Distillation ☆37 · Updated last year
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && Pytorch Implementations of… ☆102 · Updated 2 years ago
- PyTorch code and checkpoints release for VanillaKD: https://arxiv.org/abs/2305.15781 ☆74 · Updated last year
- Official code for Scale Decoupled Distillation ☆38 · Updated 10 months ago
- Masked Generative Distillation (ECCV 2022) ☆215 · Updated 2 years ago
- Official implementation of paper "Masked Distillation with Receptive Tokens", ICLR 2023 ☆68 · Updated last year
- [CVPR'24] Official implementation of paper "FreeKD: Knowledge Distillation via Semantic Frequency Prompt" ☆37 · Updated 9 months ago
- ☆125 · Updated 4 years ago
- A curated list of awesome knowledge distillation papers and codes for object detection ☆126 · Updated 11 months ago
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆266 · Updated 2 years ago
- Training ImageNet / CIFAR models with SOTA strategies and fancy techniques such as ViT, KD, Rep, etc. ☆82 · Updated 10 months ago
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation) ☆74 · Updated 2 months ago
- Official implementation for paper "Knowledge Diffusion for Distillation", NeurIPS 2023 ☆79 · Updated last year
- [ICCV 23] An approach to enhance the efficiency of Vision Transformer (ViT) by concurrently employing token pruning and token merging tech… ☆93 · Updated last year
- Official PyTorch implementation of PS-KD ☆83 · Updated 2 years ago
- Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation. NeurIPS 2022 ☆32 · Updated 2 years ago
- CVPR 2022, BatchFormer: Learning to Explore Sample Relationships for Robust Representation Learning, https://arxiv.org/abs/2203.01522 ☆248 · Updated last year
- Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition" ☆184 · Updated 9 months ago
- [CVPR 2023] This repository includes the official implementation of our paper "Masked Autoencoders Enable Efficient Knowledge Distillers" ☆103 · Updated last year
- [CVPR-2024] Official implementations of CLIP-KD: An Empirical Study of CLIP Model Distillation ☆96 · Updated 7 months ago
- [ECCV 2022] Implementation of the paper "Locality Guidance for Improving Vision Transformers on Tiny Datasets" ☆77 · Updated 2 years ago
- [NeurIPS'22] Projector Ensemble Feature Distillation ☆29 · Updated last year
- Codes for ECCV 2022 paper - contrastive deep supervision ☆68 · Updated 2 years ago
- CrossKD: Cross-Head Knowledge Distillation for Dense Object Detection ☆149 · Updated last year