DefangChen / SemCKD
[AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration".
☆75 · Updated last year
Alternatives and similar repositories for SemCKD
Users interested in SemCKD are comparing it to the repositories listed below.
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆99 · Updated 3 years ago
- Official implementation for (Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation, CVPR-2021) ☆100 · Updated last year
- Official implementation for (Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching, AAAI-2021) ☆118 · Updated 4 years ago
- A simple PyTorch reimplementation of "Online Knowledge Distillation via Collaborative Learning" ☆49 · Updated 2 years ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 ☆148 · Updated 2 years ago
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && PyTorch implementations of… ☆108 · Updated 2 years ago
- ☆127 · Updated 4 years ago
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆275 · Updated 2 years ago
- PyTorch implementation for Channel Distillation ☆102 · Updated 5 years ago
- An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100. ☆166 · Updated 4 years ago
- A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", … ☆179 · Updated 3 years ago
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆41 · Updated 2 years ago
- Official PyTorch implementation of PS-KD ☆89 · Updated 3 years ago
- [ECCV2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated 2 years ago
- Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition" ☆188 · Updated last year
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆108 · Updated 5 years ago
- ☆31 · Updated 5 years ago
- Awesome Knowledge-Distillation for CV ☆88 · Updated last year
- S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021) ☆64 · Updated 3 years ago
- Implementation of the AAAI 2021 paper "Progressive Network Grafting for Few-Shot Knowledge Distillation". ☆33 · Updated last year
- ☆34 · Updated last year
- Official implementation of the paper "Function-Consistent Feature Distillation" (ICLR 2023) ☆29 · Updated 2 years ago
- [AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation" ☆176 · Updated 8 months ago
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆74 · Updated 2 years ago
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated 4 years ago
- [NeurIPS 2020] Balanced Meta-Softmax for Long-Tailed Visual Recognition ☆141 · Updated 3 years ago
- Code and pretrained models for the paper "Data-Free Adversarial Distillation" ☆99 · Updated 2 years ago
- Official implementation of Evo-ViT: Slow-Fast Token Evolution for Dynamic Vision Transformer ☆73 · Updated 3 years ago
- (CVPR 2021, Oral) Dynamic Slimmable Network ☆229 · Updated 3 years ago
- HR-NAS: Searching Efficient High-Resolution Neural Architectures with Lightweight Transformers (CVPR21 Oral) ☆142 · Updated 4 years ago