aliyun / Revisiting-Knowledge-Distillation-an-Inheritance-and-Exploration-Framework
☆13 · Updated 3 years ago
Related projects
Alternatives and complementary repositories for Revisiting-Knowledge-Distillation-an-Inheritance-and-Exploration-Framework
- Official implementation for (Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation, CVPR-2021) ☆96 · Updated 6 months ago
- Code for Paraphrasing Complex Network: Network Compression via Factor Transfer (NeurIPS 2018) ☆19 · Updated 4 years ago
- Official PyTorch implementation of PS-KD ☆82 · Updated 2 years ago
- Awesome Knowledge-Distillation for CV ☆71 · Updated 6 months ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆89 · Updated 2 years ago
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022: Carrying out CNN Channel Pruning in a White Box ☆18 · Updated 2 years ago
- Official implementation for (Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching, AAAI-2021) ☆115 · Updated 3 years ago
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && PyTorch Implementations of… ☆101 · Updated last year
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆74 · Updated 3 months ago
- Densely Guided Knowledge Distillation using Multiple Teacher Assistants ☆8 · Updated 3 years ago
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆48 · Updated last year
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆41 · Updated 2 years ago
- Class Attention Transfer Based Knowledge Distillation (CVPR 2023) ☆33 · Updated last year
- Code for 'Multi-level Logit Distillation' (CVPR 2023) ☆57 · Updated 2 months ago
- S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021) ☆63 · Updated 3 years ago
- Official PyTorch implementation of Data-free Knowledge Distillation for Object Detection, WACV 2021. ☆61 · Updated 3 years ago
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation). ☆72 · Updated 2 months ago
- Switchable Online Knowledge Distillation ☆17 · Updated 3 weeks ago
- Code for Feature Fusion for Online Mutual Knowledge Distillation ☆24 · Updated 4 years ago
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (The paper of NORM is published in IC… ☆19 · Updated last year
- [ICLR 2022] The Unreasonable Effectiveness of Random Pruning: Return of the Most Naive Baseline for Sparse Training by Shiwei Liu, Tianlo… ☆73 · Updated last year
- A PyTorch implementation of scalable neural networks. ☆23 · Updated 4 years ago
- The official MegEngine implementation of the ECCV 2022 paper: Efficient One Pass Self-distillation with Zipf's Label Smoothin… ☆25 · Updated 2 years ago
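
Most of the repositories above extend the soft-target objective from Hinton et al.'s "Distilling the Knowledge in a Neural Network". For orientation, below is a minimal PyTorch sketch of that vanilla loss; the function name, temperature, and loss weighting are illustrative assumptions and are not taken from any of the listed projects.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Vanilla knowledge distillation: soft-target KL term plus cross-entropy.

    T and alpha are typical but arbitrary choices, not values from any repo above.
    """
    # Soften both distributions with temperature T; scale by T^2 so the
    # soft-target gradients keep a comparable magnitude across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard supervised cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage sketch: the teacher is frozen, only the student is optimized.
# with torch.no_grad():
#     teacher_logits = teacher(images)
# loss = kd_loss(student(images), teacher_logits, labels)
```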