[AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration".
☆78 · Updated Jul 29, 2024
Alternatives and similar repositories for SemCKD
Users interested in SemCKD are comparing it to the repositories listed below.
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆76 · Updated Jul 6, 2023
- Distilling Knowledge via Knowledge Review, CVPR 2021. ☆279 · Updated Dec 16, 2022
- ☆47 · Updated Sep 9, 2021
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆103 · Updated Jun 16, 2022
- Official implementation for "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI-2021). ☆119 · Updated Feb 9, 2021
- Awesome Knowledge-Distillation: a categorized collection of knowledge-distillation papers (2014–2021). ☆2,655 · Updated May 30, 2023
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation (https://arxiv.org/abs/2203.08679) and [ICCV2023] DOT: A Distill… ☆895 · Updated Nov 5, 2023
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method. ☆25 · Updated May 25, 2020
- Official implementation for "Knowledge Distillation with Refined Logits". ☆22 · Updated Aug 26, 2024
- ICML 2019 accepted paper: Overcoming Multi-Model Forgetting. ☆14 · Updated Jun 5, 2019
- [ICLR 2020] Contrastive Representation Distillation (CRD) and a benchmark of recent knowledge distillation methods. ☆2,426 · Updated Oct 16, 2023
- Code for the ICCV 2021 paper "Distilling Holistic Knowledge with Graph Neural Networks". ☆44 · Updated Dec 14, 2021
- [IJCV 2022] Domain-Specific Bias Filtering for Single Labeled Domain Generalization. ☆12 · Updated Nov 10, 2022
- Code for the ECCV 2020 paper "Improving Knowledge Distillation via Category Structure". ☆10 · Updated Mar 15, 2021
- Masked Generative Distillation (ECCV 2022). ☆241 · Updated Nov 9, 2022
- Reproducing VID, CVPR 2019 (work in progress). ☆20 · Updated Nov 25, 2019
- ☆114 · Updated Apr 21, 2021
- A simple PyTorch reimplementation of "Online Knowledge Distillation via Collaborative Learning". ☆50 · Updated Dec 13, 2022
- ☆31 · Updated Jun 18, 2020
- Unofficial PyTorch implementation of the "Class-Balanced Distillation for Long-Tailed Visual Recognition" paper. ☆17 · Updated Oct 31, 2021
- (CVPR 2021 Oral) PyTorch implementation of the Knowledge Evolution approach and Split-Nets. ☆83 · Updated Oct 9, 2021
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019. ☆414 · Updated May 17, 2021
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision. ☆237 · Updated Dec 15, 2022
- Intra-class Feature Variation Distillation for Semantic Segmentation (ECCV 2020). ☆72 · Updated Sep 10, 2020
- Experiment code for the WSDM '24 paper "MultiFS: Automated Multi-Scenario Feature Selection in Deep Recommender Systems". ☆11 · Updated May 31, 2024
- ☆27 · Updated Dec 13, 2022
- ☆25 · Updated May 20, 2020
- Official implementation of LRH-Net: A Multi-Level Knowledge Distillation Approach for Low-Resource Heart Network. ☆20 · Updated Nov 11, 2023
- Implementation for the ICASSP 2022 paper "Confidence-Aware Multi-Teacher Knowledge Distillation". ☆63 · Updated Feb 12, 2022
- Using Teacher Assistants to Improve Knowledge Distillation (https://arxiv.org/pdf/1902.03393.pdf). ☆264 · Updated Oct 3, 2019
- CVPR 2021. ☆12 · Updated Mar 29, 2021
- PyTorch implementation of "Differentiable Soft Quantization: Bridging Full-Precision and Low-Bit Neural Networks". ☆128 · Updated Jan 2, 2020
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022. ☆155 · Updated Dec 28, 2022
- Model calibration in CLIP Adapters. ☆20 · Updated Aug 19, 2024
- Official code and dataset for the NAACL 2024 paper "DialogCC: An Automated Pipeline for Creating High-Quality Multi-modal Dialogue Datase…" ☆13 · Updated Jun 24, 2024
- Implementation of the paper "Task-Oriented Feature Distillation". ☆43 · Updated Apr 25, 2022
- ☆12 · Updated Jul 30, 2019
- SHAKE. ☆18 · Updated Apr 14, 2023
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆 26 knowledge distillation methods p… ☆1,601 · Updated Dec 24, 2025