zhengli97 / CTKD
[AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation"
☆173 · Updated 5 months ago
Alternatives and similar repositories for CTKD
Users interested in CTKD are comparing it to the repositories listed below.
- 'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024) ☆228 · Updated last year
- Official implementation of paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 ☆146 · Updated 2 years ago
- Code for 'Multi-level Logit Distillation' (CVPR 2023) ☆63 · Updated 7 months ago
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition, and PyTorch implementations of… ☆106 · Updated 2 years ago
- CVPR 2023, Class Attention Transfer Based Knowledge Distillation ☆44 · Updated last year
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆94 · Updated 2 years ago
- PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444 ☆122 · Updated last year
- ☆85 · Updated last year
- Official implementation of paper "Masked Distillation with Receptive Tokens", ICLR 2023. ☆68 · Updated 2 years ago
- Official code for Scale Decoupled Distillation ☆41 · Updated last year
- Official PyTorch implementation of PS-KD ☆87 · Updated 2 years ago
- PyTorch code and checkpoints release for VanillaKD: https://arxiv.org/abs/2305.15781 ☆75 · Updated last year
- ☆125 · Updated 4 years ago
- [CVPR'24] Official implementation of paper "FreeKD: Knowledge Distillation via Semantic Frequency Prompt". ☆43 · Updated last year
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆75 · Updated 9 months ago
- Masked Generative Distillation (ECCV 2022) ☆222 · Updated 2 years ago
- CVPR 2022, BatchFormer: Learning to Explore Sample Relationships for Robust Representation Learning, https://arxiv.org/abs/2203.01522 ☆250 · Updated 2 years ago
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation). ☆77 · Updated last month
- [CVPR-2024] Official implementations of CLIP-KD: An Empirical Study of CLIP Model Distillation ☆112 · Updated 10 months ago
- Official implementation for (Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching, AAAI-2021) ☆117 · Updated 4 years ago
- Training ImageNet / CIFAR models with sota strategies and fancy techniques such as ViT, KD, Rep, etc. ☆82 · Updated last year
- [CVPR-2022] Official implementations of CIRKD: Cross-Image Relational Knowledge Distillation for Semantic Segmentation and implementation… ☆192 · Updated 10 months ago
- Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition" ☆188 · Updated last year
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆270 · Updated 2 years ago
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distill… ☆853 · Updated last year
- Official implementation of Evo-ViT: Slow-Fast Token Evolution for Dynamic Vision Transformer ☆72 · Updated 2 years ago
- [ICCV 23] An approach to enhance the efficiency of Vision Transformer (ViT) by concurrently employing token pruning and token merging tech… ☆95 · Updated last year
- Official implementation for paper "Knowledge Diffusion for Distillation", NeurIPS 2023 ☆83 · Updated last year
- Official implementation for (Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation, CVPR-2021) ☆98 · Updated last year
- ☆26 · Updated last year