[AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration".
☆78 · Updated Jul 29, 2024
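For context, most repositories below build on the standard knowledge-distillation objective of Hinton et al. (2015), in which a student network matches a teacher's temperature-softened output distribution. A minimal NumPy sketch of that baseline loss (illustrative only; SemCKD itself adds cross-layer feature matching with semantic calibration, which is not shown here):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions.

    Scaled by T^2 so gradient magnitudes stay comparable across
    temperatures (the convention from Hinton et al., 2015).
    """
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()

# A student whose logits equal the teacher's incurs zero loss.
t = np.array([[2.0, 0.5, -1.0]])
print(kd_loss(t, t))  # → 0.0
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels; the feature-based methods listed below (FitNets, CRD, SemCKD, etc.) add further losses on intermediate representations.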
Alternatives and similar repositories for SemCKD
Users interested in SemCKD are comparing it to the repositories listed below.
- Distilling Knowledge via Knowledge Review, CVPR 2021 · ☆277 · Updated Dec 16, 2022
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". · ☆103 · Updated Jun 16, 2022
- Official implementation for "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI-2021) · ☆119 · Updated Feb 9, 2021
- Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021). · ☆2,662 · Updated May 30, 2023
- A repository maintaining a collection of important papers on knowledge distillation (awesome-knowledge-distillation). · ☆85 · Updated Mar 19, 2025
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation (https://arxiv.org/abs/2203.08679) and [ICCV2023] DOT: A Distill… · ☆900 · Updated Nov 5, 2023
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method · ☆25 · Updated May 25, 2020
- Official implementation for "Knowledge Distillation with Refined Logits". · ☆23 · Updated Aug 26, 2024
- Overcoming Multi-Model Forgetting (ICML 2019) · ☆14 · Updated Jun 5, 2019
- ☆27 · Updated Feb 6, 2021
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods · ☆2,430 · Updated Oct 16, 2023
- Code for the ICCV 2021 paper "Distilling Holistic Knowledge with Graph Neural Networks" · ☆44 · Updated Dec 14, 2021
- [IJCV 2022] Domain-Specific Bias Filtering for Single Labeled Domain Generalization · ☆12 · Updated Nov 10, 2022
- Code for the ECCV 2020 paper "Improving Knowledge Distillation via Category Structure". · ☆10 · Updated Mar 15, 2021
- Reproduction of VID (CVPR 2019), work in progress · ☆20 · Updated Nov 25, 2019
- Masked Generative Distillation (ECCV 2022) · ☆242 · Updated Nov 9, 2022
- ☆114 · Updated Apr 21, 2021
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) · ☆423 · Updated Jun 23, 2020
- ☆31 · Updated Jun 18, 2020
- (CVPR 2021 Oral) PyTorch implementation of the Knowledge Evolution approach and Split-Nets · ☆83 · Updated Oct 9, 2021
- [IJCAI-2021 & TNNLS-2022] Official implementation of Hierarchical Self-supervised Augmented Knowledge Distillation · ☆78 · Updated Mar 22, 2024
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" · ☆180 · Updated Dec 3, 2024
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019 · ☆418 · Updated May 17, 2021
- [ECCV2020] Knowledge Distillation Meets Self-Supervision · ☆237 · Updated Dec 15, 2022
- Official code for the SIGIR 2025 paper "CDC: Causal Domain Clustering for Multi-Domain Recommendation". · ☆14 · Updated Aug 27, 2025
- Intra-class Feature Variation Distillation for Semantic Segmentation (ECCV 2020) · ☆71 · Updated Sep 10, 2020
- ☆27 · Updated Dec 13, 2022
- Experiment code for the WSDM '24 paper "MultiFS: Automated Multi-Scenario Feature Selection in Deep Recommender Systems" · ☆11 · Updated May 31, 2024
- ☆25 · Updated May 20, 2020
- Official implementation of LRH-Net: A Multi-Level Knowledge Distillation Approach for Low-Resource Heart Network · ☆20 · Updated Nov 11, 2023
- Using Teacher Assistants to Improve Knowledge Distillation (https://arxiv.org/pdf/1902.03393.pdf) · ☆264 · Updated Oct 3, 2019
- CVPR 2021 · ☆12 · Updated Mar 29, 2021
- FitNets: Hints for Thin Deep Nets · ☆210 · Updated May 14, 2015
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" (NeurIPS 2022) · ☆156 · Updated Dec 28, 2022
- Official implementation for "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR-2021) · ☆103 · Updated Apr 30, 2024
- Model calibration in CLIP Adapters · ☆20 · Updated Aug 19, 2024
- A PyTorch implementation of the paper "Bottom-Up and Top-Down Attention for Image Captioning and Visual Question Answering" · ☆10 · Updated Jan 20, 2020
- Implementation of the paper "Task-Oriented Feature Distillation" · ☆43 · Updated Apr 25, 2022
- An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100 · ☆167 · Updated Oct 22, 2020