WangYZ1608 / Knowledge-Distillation-via-ND
The official implementation for the paper: Improving Knowledge Distillation via Regularizing Feature Norm and Direction
☆23 · Updated 2 years ago
Alternatives and similar repositories for Knowledge-Distillation-via-ND
Users interested in Knowledge-Distillation-via-ND are comparing it to the repositories listed below
- 'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024) ☆238 · Updated 2 years ago
- [CVPR 2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier" ☆100 · Updated 3 years ago
- (No description) ☆87 · Updated 2 years ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" (NeurIPS 2022) ☆152 · Updated 2 years ago
- [CVPR 2023 Highlight] Official implementation of "Stitchable Neural Networks" ☆249 · Updated 2 years ago
- Training ImageNet / CIFAR models with SOTA strategies and techniques such as ViT, KD, Rep, etc.