SJLeo / FFSD
PyTorch implementation of our paper "Distilling a Powerful Student Model via Online Knowledge Distillation", accepted by IEEE TNNLS, 2022
☆28, updated 3 years ago
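FFSD belongs to the online knowledge distillation family, where peer students train jointly and teach each other instead of learning from a fixed pretrained teacher. As rough orientation, here is a minimal PyTorch sketch of generic online mutual distillation (deep-mutual-learning style). It is not FFSD's feature-fusion method; all names and hyperparameters are illustrative.

```python
# Minimal sketch of generic online (mutual) knowledge distillation.
# NOT the FFSD method itself; function and variable names are illustrative.
import torch
import torch.nn.functional as F

def mutual_distillation_step(student_a, student_b, optimizer, images, labels,
                             T=3.0, alpha=0.5):
    """One step where two peer students teach each other online."""
    logits_a = student_a(images)
    logits_b = student_b(images)

    # Supervised cross-entropy for both peers.
    ce = F.cross_entropy(logits_a, labels) + F.cross_entropy(logits_b, labels)

    # Symmetric KL between temperature-softened predictions; each peer's
    # target is detached so gradients flow only into the student being taught.
    kl_ab = F.kl_div(F.log_softmax(logits_a / T, dim=1),
                     F.softmax(logits_b.detach() / T, dim=1),
                     reduction="batchmean") * (T * T)
    kl_ba = F.kl_div(F.log_softmax(logits_b / T, dim=1),
                     F.softmax(logits_a.detach() / T, dim=1),
                     reduction="batchmean") * (T * T)

    loss = ce + alpha * (kl_ab + kl_ba)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The optimizer here is assumed to cover both peers, e.g. `torch.optim.SGD(list(student_a.parameters()) + list(student_b.parameters()), lr=0.1)`.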
Related projects
Alternatives and complementary repositories for FFSD
- The implementation of the AAAI 2021 paper "Progressive Network Grafting for Few-Shot Knowledge Distillation" (☆31, updated 3 months ago)
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) (☆31, updated 4 years ago)
- Code for "Feature Fusion for Online Mutual Knowledge Distillation" (☆24, updated 4 years ago)
- A simple PyTorch reimplementation of "Online Knowledge Distillation via Collaborative Learning" (☆48, updated last year)
- Implementation of the "Heterogeneous Knowledge Distillation using Information Flow Modeling" method (☆24, updated 4 years ago)
- S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021) (☆63, updated 3 years ago)
- Official MegEngine implementation of the ECCV 2022 paper "Efficient One Pass Self-distillation with Zipf's Label Smoothing" (☆25, updated 2 years ago)
- TF-FD (☆20, updated 2 years ago)
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (ICLR 2023) (☆19, updated last year)
- Code for "Self-Distillation from the Last Mini-Batch for Consistency Regularization" (☆41, updated 2 years ago)
- Code for Active Mixup (CVPR 2020) (☆22, updated 2 years ago)
- Black-box Few-shot Knowledge Distillation (☆11, updated 2 years ago)
- [AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers" (☆72, updated last year)
- [TPAMI 2023] Official implementation of L-MCL: "Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition" (☆23, updated last year)
- Neuron Merging: Compensating for Pruned Neurons (NeurIPS 2020) (☆41, updated 3 years ago)
- Paper and code for "Curriculum Learning by Optimizing Learning Dynamics" (AISTATS 2021) (☆19, updated 3 years ago)
- Code for "Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification" (ECCV 2020 Spotlight) (☆37, updated 3 years ago)
- Code for "Few Shot Network Compression via Cross Distillation" (AAAI 2020) (☆30, updated 4 years ago)
- PyTorch code for the paper "CrossTransformers: spatially-aware few-shot transfer" (☆22, updated 3 years ago)
- Switchable Online Knowledge Distillation (☆16, updated 3 weeks ago)
- Reproduction of VID (Variational Information Distillation, CVPR 2019; work in progress) (☆20, updated 4 years ago)
- IJCAI 2021: "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation" (☆39, updated last year); the two losses it compares are sketched after this list
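For reference, the IJCAI 2021 entry above contrasts the two standard distillation objectives. A minimal sketch of both, under the usual Hinton-style temperature convention (names illustrative, not taken from that repository):

```python
# Hedged sketch of the two distillation losses compared in the IJCAI 2021 paper.
import torch
import torch.nn.functional as F

def kd_kl_loss(student_logits, teacher_logits, T=4.0):
    """Classic KD: KL divergence between temperature-softened distributions."""
    return F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)

def kd_mse_loss(student_logits, teacher_logits):
    """Direct logit matching: MSE between raw (unsoftened) logits."""
    return F.mse_loss(student_logits, teacher_logits)
```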