lliai / Teacher-free-Distillation
TF-FD
☆19 · Updated 2 years ago
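For context, teacher-free distillation replaces a pretrained teacher network with a hand-crafted "virtual teacher" distribution that the student distills from. The repository's exact TF-FD method is not described on this page, so the snippet below is only a generic, minimal sketch of a Tf-KD-style regularization loss (a manually designed soft target with high probability on the true class, distilled via temperature-scaled KL divergence); the function names, the hyperparameter defaults (`temperature`, `alpha`, `correct_prob`), and the overall formulation are illustrative assumptions, not this repo's implementation.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a plain list of logits.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def virtual_teacher(num_classes, correct_class, correct_prob=0.9):
    # Hand-crafted soft target: correct_prob on the true class,
    # remaining mass spread uniformly over the other classes.
    rest = (1.0 - correct_prob) / (num_classes - 1)
    return [correct_prob if c == correct_class else rest
            for c in range(num_classes)]

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q): distillation term between virtual teacher p and student q.
    return sum(pi * math.log((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q))

def tf_kd_loss(student_logits, label, temperature=4.0,
               alpha=0.95, correct_prob=0.9):
    # Total loss = (1 - alpha) * cross-entropy
    #            + alpha * T^2 * KL(virtual_teacher || student_soft).
    # All hyperparameter values here are illustrative, not the repo's.
    probs = softmax(student_logits)
    ce = -math.log(probs[label] + 1e-12)
    student_soft = softmax(student_logits, temperature)
    teacher = virtual_teacher(len(student_logits), label, correct_prob)
    kd = kl_divergence(teacher, student_soft)
    return (1.0 - alpha) * ce + alpha * (temperature ** 2) * kd
```

A student that is confidently correct incurs a much smaller loss than one that is confidently wrong, which is the behavior any distillation objective of this shape should exhibit.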
Alternatives and similar repositories for Teacher-free-Distillation:
Users interested in Teacher-free-Distillation are comparing it to the repositories listed below.
- Official Codes and Pretrained Models for RecursiveMix ☆22 · Updated last year
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) ☆30 · Updated 4 years ago
- [CVPR 2022] "The Principle of Diversity: Training Stronger Vision Transformers Calls for Reducing All Levels of Redundancy" by Tianlong C… ☆25 · Updated 2 years ago
- Prior Knowledge Guided Unsupervised Domain Adaptation (ECCV 2022) ☆17 · Updated 2 years ago
- Implementation of the paper "Implicit Feature Refinement for Instance Segmentation" ☆20 · Updated 3 years ago