wangkai930418 / attndistill

Code for our paper "Attention Distillation: self-supervised vision transformer students need more guidance" (BMVC 2022).

Related projects: