[AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation"
☆180 · Dec 3, 2024 · Updated last year
Alternatives and similar repositories for CTKD
Users that are interested in CTKD are comparing it to the libraries listed below.
- [ICCV 2021] Official PyTorch Code for "Online Knowledge Distillation for Efficient Pose Estimation" · ☆43 · Mar 8, 2024 · Updated 2 years ago
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distill… · ☆900 · Nov 5, 2023 · Updated 2 years ago
- Code for 'Multi-level Logit Distillation' (CVPR2023) · ☆71 · Sep 23, 2024 · Updated last year
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". · ☆103 · Jun 16, 2022 · Updated 3 years ago
- 'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024) · ☆247 · Oct 10, 2023 · Updated 2 years ago
- Official PyTorch Code for "Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?" (https://arxiv.org/abs/2305.12954) · ☆48 · Dec 3, 2023 · Updated 2 years ago
- Masked Generative Distillation (ECCV 2022) · ☆241 · Nov 9, 2022 · Updated 3 years ago
- Official implementation of paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 · ☆156 · Dec 28, 2022 · Updated 3 years ago
- Official implementation of paper "Masked Distillation with Receptive Tokens", ICLR 2023. · ☆71 · Apr 14, 2023 · Updated 3 years ago
- Official Implementation of Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection · ☆51 · Oct 7, 2023 · Updated 2 years ago
- Fire Together Wire Together: A Dynamic Pruning Approach with Self-Supervised Mask Prediction · ☆10 · May 25, 2022 · Updated 3 years ago
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation). · ☆85 · Mar 19, 2025 · Updated last year
- PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444 · ☆138 · Apr 19, 2024 · Updated last year
- [NeurIPS'22] Projector Ensemble Feature Distillation · ☆30 · Jan 4, 2024 · Updated 2 years ago
- This repo holds the competitions (information, solutions, summaries, memories) that our team has participated in · ☆25 · Feb 4, 2024 · Updated 2 years ago
- ACCV2022 Source Code of paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" · ☆12 · Jul 5, 2023 · Updated 2 years ago
- PyTorch code and checkpoints release for VanillaKD: https://arxiv.org/abs/2305.15781 · ☆77 · Nov 21, 2023 · Updated 2 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods · ☆2,427 · Oct 16, 2023 · Updated 2 years ago
- [PR 2024] Official PyTorch Code for "Dual Teachers for Self-Knowledge Distillation" · ☆13 · Nov 28, 2024 · Updated last year
- Source code for the BMVC-2021 paper "SimReg: Regression as a Simple Yet Effective Tool for Self-supervised Knowledge Distillation". · ☆16 · Jan 20, 2022 · Updated 4 years ago
- SHAKE · ☆18 · Apr 14, 2023 · Updated 3 years ago
- Code for ICML 2023 paper "Quantifying the Knowledge in GNNs for Reliable Distillation into MLPs" · ☆22 · Sep 24, 2025 · Updated 6 months ago
- Knowledge distillation papers · ☆765 · Feb 10, 2023 · Updated 3 years ago
- [CVPR 2023] This repository includes the official implementation of our paper "Masked Autoencoders Enable Efficient Knowledge Distillers" · ☆108 · Jul 24, 2023 · Updated 2 years ago
- ☆16 · Nov 25, 2022 · Updated 3 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". · ☆78 · Jul 29, 2024 · Updated last year
- ☆34 · Aug 20, 2023 · Updated 2 years ago
- ☆31 · Jun 18, 2020 · Updated 5 years ago
- ☆49 · Sep 24, 2022 · Updated 3 years ago
- This repository contains the official implementation for the NAIP series of models. · ☆51 · Jan 15, 2026 · Updated 3 months ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020) · ☆30 · Aug 19, 2020 · Updated 5 years ago
- Complementary Relation Contrastive Distillation · ☆17 · Jun 29, 2021 · Updated 4 years ago
- Distilling Knowledge via Knowledge Review, CVPR 2021 · ☆278 · Dec 16, 2022 · Updated 3 years ago
- ☆48 · Feb 18, 2025 · Updated last year
- This is an official PyTorch/GPU implementation of SupMAE. · ☆79 · Aug 30, 2022 · Updated 3 years ago
- Reproducing VID from CVPR 2019 (work in progress) · ☆20 · Nov 25, 2019 · Updated 6 years ago
- ☆11 · Nov 21, 2022 · Updated 3 years ago
- This is the official repo for the CVPR 2021 L2ID paper "Distill on the Go: Online knowledge distillation in self-supervised learning" · ☆12 · Nov 15, 2021 · Updated 4 years ago
- CVPR 2023, Class Attention Transfer Based Knowledge Distillation · ☆46 · Jun 13, 2023 · Updated 2 years ago