xeanzheng / CSKD
Code for the ECCV 2020 paper "Improving Knowledge Distillation via Category Structure".
☆10 · Updated 4 years ago
Alternatives and similar repositories for CSKD:
Users interested in CSKD are comparing it to the repositories listed below:
- Graph Knowledge Distillation ☆13 · Updated 5 years ago
- Distilling Knowledge via Intermediate Classifiers ☆15 · Updated 3 years ago
- ☆26 · Updated 4 years ago
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers" ☆73 · Updated last year
- Feature Fusion for Online Mutual Knowledge Distillation Code ☆25 · Updated 4 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) ☆30 · Updated 4 years ago
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆48 · Updated 2 years ago
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆24 · Updated 4 years ago
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆40 · Updated 2 years ago
- PyTorch implementation of the IEEE TNNLS 2022 paper "Distilling a Powerful Student Model via Online Knowledge Distillation" ☆28 · Updated 3 years ago
- Implementation of the AAAI 2021 paper "Progressive Network Grafting for Few-Shot Knowledge Distillation" ☆33 · Updated 8 months ago
- ☆9 · Updated 3 years ago
- Code for the paper "Few Shot Network Compression via Cross Distillation", AAAI 2020 ☆31 · Updated 5 years ago
- [ICASSP 2020] Code release for the paper "Heterogeneous Domain Generalization via Domain Mixup" ☆25 · Updated 4 years ago
- ☆33 · Updated last year
- IJCAI 2021, "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation" ☆40 · Updated 2 years ago
- ZSKD with PyTorch ☆30 · Updated last year
- Switchable Online Knowledge Distillation ☆18 · Updated 5 months ago
- Code release for "Self-supervised Learning is More Robust to Dataset Imbalance" ☆38 · Updated 3 years ago
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆15 · Updated 4 years ago
- TF-FD ☆20 · Updated 2 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration" ☆74 · Updated 7 months ago
- Source code for "Knowledge Distillation via Instance Relationship Graph" ☆30 · Updated 5 years ago
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… ☆81 · Updated 3 years ago
- Information Bottleneck Approach to Spatial Attention Learning, IJCAI 2021 ☆15 · Updated 3 years ago
- Auto-Prox-AAAI24 ☆12 · Updated 10 months ago
- ☆25 · Updated 4 years ago
- Data-free knowledge distillation using Gaussian noise (NeurIPS paper) ☆15 · Updated 2 years ago
- The official implementation of "DOTS: Decoupling Operation and Topology in Differentiable Architecture Search" ☆19 · Updated 3 years ago
- An efficient implementation for ImageNet classification ☆17 · Updated 4 years ago