zhengli97 / CTKD
[AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation"
☆182 · Updated Dec 3, 2024
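For context on the repository above: vanilla knowledge distillation softens both teacher and student logits with a temperature T before matching them; CTKD's idea is to schedule T with a curriculum rather than fix it. A minimal sketch of the standard temperature-scaled KD loss (generic Hinton-style KD with a fixed T, not this repository's code; all names are illustrative):

```python
import numpy as np

def softmax(logits, T):
    """Temperature-scaled softmax; subtracting the row max is for numerical stability."""
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    averaged over the batch. CTKD's contribution is scheduling T via a
    curriculum during training; this sketch keeps T fixed."""
    p_t = softmax(teacher_logits, T)
    log_p_t = np.log(p_t + 1e-12)
    log_p_s = np.log(softmax(student_logits, T) + 1e-12)
    kl = (p_t * (log_p_t - log_p_s)).sum(axis=1).mean()
    # The T^2 factor keeps gradient magnitudes comparable to the plain
    # cross-entropy term when the two losses are combined.
    return kl * T ** 2

rng = np.random.default_rng(0)
s = rng.normal(size=(8, 10))  # student logits, batch of 8, 10 classes
t = rng.normal(size=(8, 10))  # teacher logits
loss = kd_loss(s, t)
```

A larger T flattens both distributions, exposing the teacher's "dark knowledge" in the non-target classes; curriculum-style methods typically start with an easy (high or learned) temperature and adapt it as training progresses.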
Alternatives and similar repositories for CTKD
Users interested in CTKD are comparing it to the repositories listed below.
- [ICCV 2021] Official PyTorch Code for "Online Knowledge Distillation for Efficient Pose Estimation" (☆43, updated Mar 8, 2024)
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation (https://arxiv.org/abs/2203.08679) and [ICCV2023] DOT: A Distill… (☆889, updated Nov 5, 2023)
- [CVPR 2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier" (☆102, updated Jun 16, 2022)
- Code for "Multi-level Logit Distillation" (CVPR 2023) (☆71, updated Sep 23, 2024)
- This repo holds the research projects of our lab (☆12, updated Jan 20, 2024)
- "NKD and USKD" (ICCV 2023) and "ViTKD" (CVPRW 2024) (☆243, updated Oct 10, 2023)
- [CVPR 2024 Highlight] Logit Standardization in Knowledge Distillation (☆391, updated Oct 9, 2024)
- Official Codes and Pretrained Models for RecursiveMix (☆22, updated Apr 24, 2023)
- Official PyTorch Code for "Dynamic Temperature Knowledge Distillation" (☆11, updated Mar 28, 2025)
- Official implementation of the paper "Masked Distillation with Receptive Tokens" (ICLR 2023) (☆71, updated Apr 14, 2023)
- Masked Generative Distillation (ECCV 2022) (☆240, updated Nov 9, 2022)
- PyTorch code and checkpoints release for VanillaKD (https://arxiv.org/abs/2305.15781) (☆76, updated Nov 21, 2023)
- Official PyTorch Code for "Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?" (https://arxiv.org/abs/2305.12954) (☆49, updated Dec 3, 2023)
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" (NeurIPS 2022) (☆155, updated Dec 28, 2022)
- This repo holds the competitions (information, solutions, summaries, memories) that our team has participated in (☆26, updated Feb 4, 2024)
- Official implementation of "Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection" (☆50, updated Oct 7, 2023)
- [PR 2024] Official PyTorch Code for "Dual Teachers for Self-Knowledge Distillation" (☆13, updated Nov 28, 2024)
- Source code for the BMVC 2021 paper "SimReg: Regression as a Simple Yet Effective Tool for Self-supervised Knowledge Distillation" (☆16, updated Jan 20, 2022)
- [NeurIPS'22] Projector Ensemble Feature Distillation (☆30, updated Jan 4, 2024)
- PyTorch implementation of various knowledge distillation (KD) methods (☆1,742, updated Nov 25, 2021)
- Official Codes for "Uniform Masking: Enabling MAE Pre-training for Pyramid-based Vision Transformers with Locality" (☆245, updated Dec 3, 2022)
- PyTorch code and checkpoints release for OFA-KD (https://arxiv.org/abs/2310.19444) (☆135, updated Apr 19, 2024)
- ACCV 2022 source code of the paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" (☆12, updated Jul 5, 2023)
- Official implementation for "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021) (☆103, updated Apr 30, 2024)
- [CVPR 2023] The official implementation of our paper "Masked Autoencoders Enable Efficient Knowledge Distillers" (☆109, updated Jul 24, 2023)
- (no description) (☆31, updated Jun 18, 2020)
- (no description) (☆49, updated Feb 18, 2025)
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation) (☆82, updated Mar 19, 2025)
- Official implementation for the AAAI 2025 paper "From Words to Worth: Newborn Article Impact Prediction with LL… (☆52, updated Jan 15, 2026)
- Official implementation for "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021) (☆119, updated Feb 9, 2021)
- (no description) (☆49, updated Sep 24, 2022)
- (no description) (☆27, updated Jun 20, 2021)
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (the NORM paper is published in IC… (☆20, updated Sep 18, 2023)
- [ACM MM'23] Official implementation of the paper "Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty" (☆14, updated Nov 22, 2023)
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation" (TNNLS, https://ieeexplore.ieee.org/abstract/…) (☆12, updated Dec 21, 2022)
- The official repo for the CVPR 2021 L2ID paper "Distill on the Go: Online knowledge distillation in self-supervised learning" (☆12, updated Nov 15, 2021)
- Official PyTorch implementation of "E2VPT: An Effective and Efficient Approach for Visual Prompt Tuning" (ICCV 2023) (☆72, updated Jan 19, 2024)
- Knowledge distillation papers (☆767, updated Feb 10, 2023)
- TF-FD (☆20, updated Nov 19, 2022)