ifding / DLKD
Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation" (IEEE TNNLS): https://ieeexplore.ieee.org/abstract/document/9830618
☆11, updated last year
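The paper's title names two distillation levels: knowledge alignment (matching individual teacher/student representations) and knowledge correlation (matching the relational structure between samples). As a rough illustration of that general idea only, here is a minimal NumPy sketch with a per-sample MSE alignment term and a Gram-matrix correlation term; the function names, the plain-MSE choices, and the toy features are illustrative assumptions, not the paper's actual losses.

```python
import numpy as np

def alignment_loss(f_s, f_t):
    # Alignment-style term (illustrative): match each student feature
    # to the corresponding teacher feature with a simple MSE.
    return float(np.mean((f_s - f_t) ** 2))

def correlation_loss(f_s, f_t):
    # Correlation-style term (illustrative): match the pairwise
    # similarity structure across the batch via normalized Gram matrices.
    def gram(f):
        f = f / (np.linalg.norm(f, axis=1, keepdims=True) + 1e-8)
        return f @ f.T
    return float(np.mean((gram(f_s) - gram(f_t)) ** 2))

# Toy teacher/student features: batch of 4 samples, 8-dim each.
rng = np.random.default_rng(0)
f_t = rng.normal(size=(4, 8))               # "teacher" features
f_s = f_t + 0.1 * rng.normal(size=(4, 8))   # noisy "student" features
loss = alignment_loss(f_s, f_t) + correlation_loss(f_s, f_t)
print(loss)
```

In a real training loop both terms would be differentiable (e.g. in PyTorch) and weighted against the usual task loss; consult the repository itself for the paper's actual formulation.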
Related projects
Alternatives and complementary repositories for DLKD
- PyTorch implementation of "Distilling a Powerful Student Model via Online Knowledge Distillation" (IEEE TNNLS, 2022) (☆28, updated 3 years ago)
- [TPAMI 2023] Official implementation of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition (☆23, updated last year)
- Official PyTorch source code for "Long-tailed Visual Recognition via Gaussian Clouded Logit Adjustment" (☆36, updated 4 months ago)
- [CVPR 2023] Class Attention Transfer Based Knowledge Distillation (☆33, updated last year)
- Code release for "Dynamic Domain Adaptation for Efficient Inference" (CVPR 2021) (☆33, updated 2 years ago)
- Code for Feature Fusion for Online Mutual Knowledge Distillation (☆24, updated 4 years ago)
- Switchable Online Knowledge Distillation (☆16, updated 2 weeks ago)
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method (☆24, updated 4 years ago)
- Official implementation of "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021) (☆95, updated 6 months ago)
- Black-box Few-shot Knowledge Distillation (☆11, updated 2 years ago)
- Official code of CPR (ICLR 2021) (☆13, updated 3 years ago)
- Code for "Multi-level Logit Distillation" (CVPR 2023) (☆54, updated last month)
- [ECCV 2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && PyTorch Implementations of… (☆99, updated last year)
- Official implementation of the ECCV 2022 paper "Class-Incremental Learning with Cross-Space Clustering and Controlled Transfer" (☆17, updated 2 years ago)
- Official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (the NORM paper is published in IC… (☆19, updated last year)
- [CVPR 2023] Regularizing Second-Order Influences for Continual Learning (☆33, updated last year)
- [CVPR 2022] Official implementation of "Knowledge Distillation with the Reused Teacher Classifier" (☆89, updated 2 years ago)
- AMTML-KD: Adaptive Multi-teacher Multi-level Knowledge Distillation (☆48, updated 3 years ago)
- [AAAI 2021, TKDE 2023] Official implementation of "Cross-Layer Distillation with Semantic Calibration" (☆74, updated 3 months ago)
- Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation (NeurIPS 2022) (☆29, updated 2 years ago)
- Code release for "Cycle Self-Training for Domain Adaptation" (NeurIPS 2021) (☆51, updated 2 years ago)
- ISD: Self-Supervised Learning by Iterative Similarity Distillation (☆36, updated 3 years ago)
- Code for "Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification" (ECCV 2020 Spotlight) (☆36, updated 3 years ago)
- Official code for the CVPR 2022 paper "Relieving Long-tailed Instance Segmentation via Pairwise Class Balance" (☆37, updated 2 years ago)
- Code for "Self-Distillation from the Last Mini-Batch for Consistency Regularization" (☆41, updated 2 years ago)
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) (☆31, updated 4 years ago)
- S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021) (☆63, updated 3 years ago)