DefangChen / SemCKD
[AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration".
☆75 · Updated 10 months ago
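This page only links the official code, so as a rough orientation: the core idea of SemCKD is that each student layer attends softly over all teacher layers instead of being tied to one hand-picked layer. Below is a minimal PyTorch sketch of that cross-layer idea under my own simplifications (Gram matrices of pooled features to sidestep channel mismatches, layer-level rather than per-sample attention, and no learned projections); the function names are mine and this is not the official implementation.

```python
import torch
import torch.nn.functional as F

def gram(feat):
    """Batch-wise Gram matrix: (B, C, H, W) -> (B, B)."""
    v = feat.mean(dim=(2, 3))        # global average pooling to (B, C)
    v = F.normalize(v, dim=1)        # cosine-style similarity
    return v @ v.t()

def cross_layer_kd_loss(student_feats, teacher_feats):
    # Sketch of cross-layer distillation with soft layer assignment:
    # each student layer gets a softmax weighting over ALL teacher
    # layers, so semantically mismatched pairs are down-weighted.
    t_grams = [gram(t) for t in teacher_feats]
    loss = student_feats[0].new_zeros(())
    for s in student_feats:
        g_s = gram(s)
        # Distance of this student layer to every teacher layer.
        dists = torch.stack([(g_s - g_t).pow(2).mean() for g_t in t_grams])
        attn = F.softmax(-dists, dim=0)   # closer layers get more weight
        loss = loss + (attn * dists).sum()
    return loss / len(student_feats)

# Toy usage: two student and two teacher stages with mismatched channels.
s_feats = [torch.randn(8, 64, 16, 16), torch.randn(8, 128, 8, 8)]
t_feats = [torch.randn(8, 256, 16, 16), torch.randn(8, 512, 8, 8)]
print(cross_layer_kd_loss(s_feats, t_feats))
```

The paper's actual method additionally learns per-sample attention and projection layers to align feature shapes; the sketch only captures the soft layer-assignment idea.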
Alternatives and similar repositories for SemCKD
Users interested in SemCKD are comparing it to the repositories listed below; a minimal sketch of the vanilla distillation objective most of them extend follows the list.
- [AAAI-2021] Official implementation for "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" ☆117 · Updated 4 years ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆97 · Updated 3 years ago
- [CVPR-2021] Official implementation for "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" ☆100 · Updated last year
- A simple PyTorch reimplementation of "Online Knowledge Distillation via Collaborative Learning" ☆49 · Updated 2 years ago
- [ECCV-2022] Official implementation of "MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition" and PyTorch implementations of… ☆108 · Updated 2 years ago
- PyTorch implementation for Channel Distillation ☆101 · Updated 5 years ago
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆40 · Updated 2 years ago
- ☆126 · Updated 4 years ago
- An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100. ☆166 · Updated 4 years ago
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆272 · Updated 2 years ago
- Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition" ☆188 · Updated last year
- [ECCV-2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated 2 years ago
- Official PyTorch implementation of PS-KD ☆88 · Updated 2 years ago
- Implementation of the paper "Task-Oriented Feature Distillation" ☆43 · Updated 3 years ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 ☆146 · Updated 2 years ago
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆74 · Updated last year
- ☆31 · Updated 5 years ago
- PyTorch implementation of Matching Guided Distillation [ECCV 2020] ☆64 · Updated 3 years ago
- ☆46 · Updated 3 years ago
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated 4 years ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆108 · Updated 5 years ago
- ☆47 · Updated 2 years ago
- Code for "Paraphrasing Complex Network: Network Compression via Factor Transfer" (NeurIPS 2018) ☆20 · Updated 4 years ago
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022: "Carrying out CNN Channel Pruning in a White Box" ☆18 · Updated 3 years ago
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation). ☆78 · Updated 3 months ago
- Code and pretrained models for the paper "Data-Free Adversarial Distillation" ☆99 · Updated 2 years ago
- S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021) ☆64 · Updated 3 years ago
- [NeurIPS-2021] Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data ☆45 · Updated 2 years ago
- [IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation ☆72 · Updated 3 years ago
- ☆24 · Updated 3 years ago
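Nearly all of the repositories above are variants of, or additions to, the classic logit-distillation objective of Hinton et al. As a common reference point, here is a minimal sketch of that baseline loss; the hyperparameters T and alpha are illustrative defaults, not values taken from any repository listed here.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    # Soft part: KL divergence between temperature-softened class
    # distributions, rescaled by T^2 to keep gradient magnitudes
    # comparable across temperatures (as in Hinton et al.).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard part: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard
```

The feature-distillation, self-distillation, and data-free methods listed above typically replace or augment the soft term while keeping this overall student-teacher training loop.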