ifding / DLKD
Source Code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/document/9830618
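The repositories below all build on logit- or feature-level knowledge distillation. As a point of reference, here is a minimal sketch of the classic logit-level KD objective (softened KL term plus cross-entropy, in the style of Hinton et al.) — a generic baseline, not DLKD's dual-level knowledge alignment and correlation loss; the function name and defaults are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Generic logit-level knowledge distillation loss (illustrative sketch).

    Blends a temperature-softened KL divergence between teacher and student
    output distributions with the standard cross-entropy on ground-truth
    labels. T and alpha are typical, not paper-specific, values.
    """
    # KL(teacher || student) on temperature-softened distributions,
    # rescaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Supervised cross-entropy on the hard labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

Most of the listed projects differ in what they distill (logits, intermediate features, correlations) and from whom (a single teacher, an ensemble, or peer students online), while keeping a blended objective of this general shape.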
☆12 · Updated 2 years ago
Alternatives and similar repositories for DLKD:
Users interested in DLKD are comparing it to the repositories listed below.
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022 -- Distilling a Powerful Student Model via Online Knowledge Distillation (☆28, updated 3 years ago)
- Code release for "Dynamic Domain Adaptation for Efficient Inference" (CVPR 2021) (☆33, updated 2 years ago)
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (The paper of NORM is published in IC…) (☆19, updated last year)
- Switchable Online Knowledge Distillation (☆18, updated 2 months ago)
- TF-FD (☆19, updated 2 years ago)
- Distilling knowledge from an ensemble of multiple teacher networks to a student network with multiple heads (☆7, updated 3 years ago)
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) (☆30, updated 4 years ago)
- [ICASSP 2021] Official implementations of Multi-View Contrastive Learning for Online Knowledge Distillation (MCL-OKD) (☆26, updated 3 years ago)
- (untitled) (☆21, updated 3 years ago)
- [TPAMI 2023] Official implementations of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition (☆23, updated last year)
- Local Context-Aware Active Domain Adaptation (ICCV 2023)