[AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation"
☆180 · Dec 3, 2024 · Updated last year
Alternatives and similar repositories for CTKD
Users interested in CTKD are comparing it with the repositories listed below.
- [ICCV 2021] Official PyTorch Code for "Online Knowledge Distillation for Efficient Pose Estimation" ☆42 · Mar 8, 2024 · Updated 2 years ago
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distill… ☆899 · Nov 5, 2023 · Updated 2 years ago
- Code for 'Multi-level Logit Distillation' (CVPR 2023) ☆70 · Sep 23, 2024 · Updated last year
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier" ☆102 · Jun 16, 2022 · Updated 3 years ago
- This repo holds the research projects of our lab. ☆11 · Jan 20, 2024 · Updated 2 years ago
- 'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024) ☆247 · Oct 10, 2023 · Updated 2 years ago
- Official PyTorch Code for "Dynamic Temperature Knowledge Distillation" ☆11 · Mar 28, 2025 · Updated last year
- Masked Generative Distillation (ECCV 2022) ☆242 · Nov 9, 2022 · Updated 3 years ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 ☆156 · Dec 28, 2022 · Updated 3 years ago
- Official implementation of the paper "Masked Distillation with Receptive Tokens", ICLR 2023 ☆71 · Apr 14, 2023 · Updated 3 years ago
- Fire Together Wire Together: A Dynamic Pruning Approach with Self-Supervised Mask Prediction ☆10 · May 25, 2022 · Updated 3 years ago
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation) ☆85 · Mar 19, 2025 · Updated last year
- [NeurIPS'22] Projector Ensemble Feature Distillation ☆30 · Jan 4, 2024 · Updated 2 years ago
- This repo holds the competitions (information, solutions, summaries, memories) that our team has participated in ☆25 · Feb 4, 2024 · Updated 2 years ago
- PyTorch implementation of various Knowledge Distillation (KD) methods ☆1,749 · Nov 25, 2021 · Updated 4 years ago
- Official implementation for "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021) ☆103 · Apr 30, 2024 · Updated 2 years ago
- Source code of the ACCV 2022 paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" ☆12 · Jul 5, 2023 · Updated 2 years ago
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (The paper of NORM is published in IC… ☆20 · Sep 18, 2023 · Updated 2 years ago
- [PR 2024] Official PyTorch Code for "Dual Teachers for Self-Knowledge Distillation" ☆13 · Nov 28, 2024 · Updated last year
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Dec 21, 2022 · Updated 3 years ago
- Source code for the BMVC 2021 paper "SimReg: Regression as a Simple Yet Effective Tool for Self-supervised Knowledge Distillation" ☆16 · Jan 20, 2022 · Updated 4 years ago
- SHAKE ☆18 · Apr 14, 2023 · Updated 3 years ago
- Code for the ICML 2023 paper "Quantifying the Knowledge in GNNs for Reliable Distillation into MLPs" ☆22 · Sep 24, 2025 · Updated 7 months ago
- [CVPR 2023] This repository includes the official implementation of our paper "Masked Autoencoders Enable Efficient Knowledge Distillers" ☆108 · Jul 24, 2023 · Updated 2 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration" ☆78 · Jul 29, 2024 · Updated last year
- ☆16 · Nov 25, 2022 · Updated 3 years ago
- ☆34 · Aug 20, 2023 · Updated 2 years ago
- ☆31 · Jun 18, 2020 · Updated 5 years ago
- ☆49 · Sep 24, 2022 · Updated 3 years ago
- This repository contains the official implementation for the NAIP model family. ☆51 · Jan 15, 2026 · Updated 3 months ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) ☆30 · Aug 19, 2020 · Updated 5 years ago
- Distilling Knowledge via Knowledge Review (CVPR 2021) ☆277 · Dec 16, 2022 · Updated 3 years ago
- Complementary Relation Contrastive Distillation ☆17 · Jun 29, 2021 · Updated 4 years ago
- ☆86 · Aug 31, 2023 · Updated 2 years ago
- Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014–2021) ☆2,663 · May 30, 2023 · Updated 2 years ago
- ☆49 · Feb 18, 2025 · Updated last year
- This is an official PyTorch/GPU implementation of SupMAE ☆80 · Aug 30, 2022 · Updated 3 years ago
- Reproducing VID from CVPR 2019 (work in progress) ☆20 · Nov 25, 2019 · Updated 6 years ago
- This is the official repo for the CVPR 2021 L2ID paper "Distill on the Go: Online knowledge distillation in self-supervised learning" ☆12 · Nov 15, 2021 · Updated 4 years ago
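Most of the repositories above build on the same temperature-scaled distillation loss from Hinton et al. (2015); CTKD's contribution is to schedule the temperature as a curriculum during training rather than fix it. As orientation, here is a minimal pure-Python sketch of that base loss. This is a generic illustration, not code from CTKD or any repository listed here, and the function names are ours:

```python
import math

def softmax(logits, T):
    # Temperature-scaled softmax: larger T yields a softer distribution,
    # exposing more of the teacher's "dark knowledge" between classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T):
    # Hinton-style distillation term: KL(teacher || student) computed on
    # temperature-softened distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# Identical logits give zero loss at any temperature.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1], T=4.0))  # → 0.0
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels; methods like CTKD and "Dynamic Temperature Knowledge Distillation" above differ mainly in how T is chosen or adapted per step.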