JinYu1998 / DTKD
Official PyTorch Code for "Dynamic Temperature Knowledge Distillation"
☆11 · Updated 10 months ago
Alternatives and similar repositories for DTKD
Users interested in DTKD are comparing it to the repositories listed below.
- The official implementation of [NeurIPS 2024] Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation https://ar… ☆49 · Updated last year
- Code for 'Multi-level Logit Distillation' (CVPR 2023) ☆71 · Updated last year
- Training ImageNet / CIFAR models with SOTA strategies and fancy techniques such as ViT, KD, Rep, etc.
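All of the repositories above build on the standard temperature-scaled knowledge-distillation loss. As a point of reference, here is a minimal PyTorch sketch of that loss; allowing `temperature` to be a per-sample tensor rather than a fixed scalar hints at the "dynamic temperature" idea, but the actual temperature schedule used by DTKD is defined in the paper and repo, not here.

```python
import torch
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, temperature):
    """Temperature-scaled KD loss (Hinton-style soft-label distillation).

    `temperature` may be a Python scalar or a per-sample (B, 1) tensor;
    the per-sample case is only an illustration of a dynamic temperature,
    not DTKD's actual schedule.
    """
    t = torch.as_tensor(temperature, dtype=student_logits.dtype)
    # Soften both distributions with the same temperature.
    p_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_p_student = F.log_softmax(student_logits / t, dim=-1)
    # Per-sample KL(teacher || student) over the class dimension.
    kl = F.kl_div(log_p_student, p_teacher, reduction="none").sum(dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return (kl * (t ** 2).reshape(-1)).mean()
```

In practice this term is mixed with the ordinary cross-entropy on ground-truth labels via a weighting coefficient; the T^2 factor compensates for the 1/T^2 shrinkage of the softened gradients.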