i13abe / Triplet-Loss-for-Knowledge-Distillation
Triplet Loss for Knowledge Distillation
☆17 · Updated 2 years ago
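The repository's title names its core idea. As a rough, hypothetical illustration (not the repository's actual code), a triplet-style distillation loss might treat the student's embedding of a sample as the anchor, the teacher's embedding of the same sample as the positive, and the teacher's embedding of a differently labeled sample as the negative. The sketch below assumes PyTorch; `distillation_triplet_loss` and its in-batch negative mining are illustrative names and choices, not taken from the repository.

```python
# Hypothetical sketch of a triplet loss for knowledge distillation,
# NOT the repository's actual implementation.
import torch
import torch.nn.functional as F

def distillation_triplet_loss(student_emb, teacher_emb, labels, margin=1.0):
    """Pull each student embedding toward the teacher's embedding of the
    same input, and push it away from a teacher embedding of a sample
    with a different label (simple in-batch negative mining).

    Assumes student_emb and teacher_emb are (B, D) tensors for the same
    batch, and that each batch contains at least two distinct classes.
    """
    batch_size = student_emb.size(0)
    neg_idx = torch.empty(batch_size, dtype=torch.long)
    for i in range(batch_size):
        # Candidate negatives: in-batch samples whose label differs.
        candidates = (labels != labels[i]).nonzero(as_tuple=True)[0]
        neg_idx[i] = candidates[torch.randint(len(candidates), (1,)).item()]
    negatives = teacher_emb[neg_idx]
    # Anchor = student embedding, positive/negative = teacher embeddings.
    return F.triplet_margin_loss(student_emb, teacher_emb, negatives,
                                 margin=margin)
```

In practice a loss like this would typically be combined with a standard classification or distillation term; the margin and negative-mining strategy here are placeholder choices.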
Alternatives and similar repositories for Triplet-Loss-for-Knowledge-Distillation:
Users interested in Triplet-Loss-for-Knowledge-Distillation are comparing it to the repositories listed below
- Official code of the paper "HoMM: Higher-order Moment Matching for Unsupervised Domain Adaptation" (AAAI 2020) · ☆43 · Updated 5 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) · ☆30 · Updated 4 years ago
- Code for Feature Fusion for Online Mutual Knowledge Distillation · ☆24 · Updated 4 years ago
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification