hunto / DIST_KD
Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" (NeurIPS 2022)
☆149 · Updated 2 years ago
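For context on what this repository implements: the DIST paper matches teacher and student predictions via Pearson correlation of the logit distributions (an inter-class relation per sample, plus an intra-class relation across the batch) instead of exact KL matching, which makes distillation more robust when the teacher is much stronger than the student. The sketch below illustrates that correlation-based loss; the function names, default weights, and temperature handling here are illustrative assumptions, not the repository's exact API.

```python
import torch
import torch.nn.functional as F

def pearson_distance(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """1 - Pearson correlation along the last dim, averaged over the rest."""
    a = a - a.mean(dim=-1, keepdim=True)   # center
    b = b - b.mean(dim=-1, keepdim=True)
    a = F.normalize(a, dim=-1)             # unit norm -> cosine of centered
    b = F.normalize(b, dim=-1)             #   vectors equals Pearson corr.
    return 1.0 - (a * b).sum(dim=-1).mean()

def dist_style_loss(student_logits: torch.Tensor,
                    teacher_logits: torch.Tensor,
                    tau: float = 1.0,
                    beta: float = 1.0,    # weight of inter-class term (assumed)
                    gamma: float = 1.0):  # weight of intra-class term (assumed)
    # Soften both predictions with a shared temperature.
    p_s = (student_logits / tau).softmax(dim=1)  # (batch, classes)
    p_t = (teacher_logits / tau).softmax(dim=1)
    # Inter-class relation: correlate the class distribution of each sample.
    inter = pearson_distance(p_s, p_t)
    # Intra-class relation: correlate each class's scores across the batch.
    intra = pearson_distance(p_s.t(), p_t.t())
    return beta * inter + gamma * intra

# Usage sketch: batch of 8 samples, 100 classes.
s = torch.randn(8, 100)
t = torch.randn(8, 100)
loss = dist_style_loss(s, t, tau=4.0)
```

In training this term would be added to the usual cross-entropy loss on ground-truth labels; consult the repository itself for the exact weighting and temperature scaling it uses.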
Alternatives and similar repositories for DIST_KD
Users interested in DIST_KD are comparing it to the repositories listed below
- [CVPR 2022] Official implementation of "Knowledge Distillation with the Reused Teacher Classifier" ☆100 · Updated 3 years ago
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" ☆179 · Updated 10 months ago
- Training ImageNet / CIFAR models with SOTA strategies and techniques such as ViT, KD, Rep, etc. ☆84 · Updated last year
- [ECCV 2022] Official implementation of "MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition" and PyTorch implementations of… ☆109 · Updated 2 years ago
- "NKD and USKD" (ICCV 2023) and "ViTKD" (CVPRW 2024) ☆237 · Updated last year
- PyTorch code and checkpoints for OFA-KD: https://arxiv.org/abs/2310.19444 ☆131 · Updated last year
- PyTorch code and checkpoints for VanillaKD: https://arxiv.org/abs/2305.15781 ☆76 · Updated last year
- [AAAI 2021, TKDE 2023] Official implementation of "Cross-Layer Distillation with Semantic Calibration" ☆75 · Updated last year
- [CVPR 2023] Code for "Multi-level Logit Distillation" ☆69 · Updated last year
- Official code for "Scale Decoupled Distillation" ☆41 · Updated last year
- [ECCV 2022] Implementation of the paper "Locality Guidance for Improving Vision Transformers on Tiny Datasets" ☆80 · Updated 3 years ago
- [CVPR 2023] Class Attention Transfer Based Knowledge Distillation ☆46 · Updated 2 years ago
- [AAAI 2023 Oral] PyTorch implementation of "CF-ViT: A General Coarse-to-Fine Method for Vision Transformer"