SJLeo / FFSD
PyTorch implementation of our paper "Distilling a Powerful Student Model via Online Knowledge Distillation", accepted by IEEE TNNLS, 2022.
☆30 · Updated 3 years ago
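Most of the repositories listed on this page implement variants of knowledge distillation, which trains a student network to match a teacher's temperature-softened output distribution. As a point of reference, here is a minimal plain-Python sketch of the classic Hinton-style distillation loss (an illustrative example only, not the FFSD code; the function and parameter names are made up for this sketch):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The loss is zero when the student's logits match the teacher's and positive otherwise; in practice it is combined with a standard cross-entropy term on the ground-truth labels.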
Alternatives and similar repositories for FFSD
Users interested in FFSD are comparing it to the repositories listed below.
- ☆57 · Updated 4 years ago
- S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021) ☆64 · Updated 4 years ago
- ☆27 · Updated 2 years ago
- Code for the paper "Few Shot Network Compression via Cross Distillation", AAAI 2020 ☆32 · Updated 5 years ago
- ☆20 · Updated 2 years ago
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆49 · Updated 2 years ago
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆25 · Updated 5 years ago
- Code for Active Mixup (CVPR 2020) ☆23 · Updated 3 years ago
- Revisiting Parameter Sharing for Automatic Neural Channel Number Search, NeurIPS 2020 ☆21 · Updated 4 years ago
- [NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang… ☆89 · Updated last year
- ☆22 · Updated 5 years ago
- Implementation of the "Adapting Auxiliary Losses Using Gradient Similarity" paper ☆32 · Updated 6 years ago
- Paper and code for "Curriculum Learning by Optimizing Learning Dynamics" (AISTATS 2021) ☆19 · Updated 4 years ago
- ICML'20: SIGUA: Forgetting May Make Learning with Noisy Labels More Robust ☆16 · Updated 4 years ago
- [ICLR 2021] Heteroskedastic and Imbalanced Deep Learning with Adaptive Regularization ☆41 · Updated 4 years ago
- Neuron Merging: Compensating for Pruned Neurons (NeurIPS 2020) ☆43 · Updated 4 years ago
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆41 · Updated 2 years ago
- [AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers" ☆75 · Updated 2 years ago
- ☆27 · Updated 4 years ago
- Implementation of PGONAS (CVPR 2022 Workshop) and RD-NAS (ICASSP 2023) ☆22 · Updated 2 years ago
- IJCAI 2021: "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation" ☆42 · Updated 2 years ago
- Implementation of the AAAI 2021 paper "Progressive Network Grafting for Few-Shot Knowledge Distillation" ☆34 · Updated last year
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated 4 years ago
- Official MegEngine implementation of the ECCV 2022 paper "Efficient One Pass Self-distillation with Zipf's Label Smoothin…" ☆26 · Updated 2 years ago
- Code for ViTAS: Vision Transformer Architecture Search ☆50 · Updated 4 years ago
- Code for DATA: Differentiable ArchiTecture Approximation ☆11 · Updated 4 years ago
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… ☆82 · Updated 3 years ago
- Auto-Prox-AAAI24 ☆13 · Updated last year
- Code for Feature Fusion for Online Mutual Knowledge Distillation ☆26 · Updated 5 years ago
- Official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (the NORM paper is published in IC…) ☆20 · Updated last year