asaander719 / PSPNet-knowledge-distillation

This repo uses a combination of logit and feature distillation to train a student PSPNet with a ResNet18 backbone under the guidance of a teacher PSPNet with a ResNet50 backbone. All models are trained and tested on the PASCAL VOC 2012 dataset.
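A minimal sketch of what a combined logit + feature distillation loss can look like in PyTorch. This is an illustration only, not the repo's actual code; the class name, weights (`alpha`, `beta`), temperature, and the 1x1 adapter conv are all assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DistillationLoss(nn.Module):
    """Hypothetical combined loss: CE on labels + KL on logits + MSE on features."""

    def __init__(self, student_channels, teacher_channels,
                 temperature=4.0, alpha=0.5, beta=0.5, ignore_index=255):
        super().__init__()
        self.T = temperature      # softening temperature for logit distillation
        self.alpha = alpha        # weight of the logit (KL) term
        self.beta = beta          # weight of the feature (MSE) term
        # 1x1 conv projecting student features to the teacher's channel width
        self.adapter = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)
        self.ce = nn.CrossEntropyLoss(ignore_index=ignore_index)

    def forward(self, s_logits, t_logits, s_feat, t_feat, target):
        # Standard supervised segmentation loss against ground-truth labels
        loss_ce = self.ce(s_logits, target)

        # Logit distillation: KL divergence between softened class distributions
        loss_kd = F.kl_div(
            F.log_softmax(s_logits / self.T, dim=1),
            F.softmax(t_logits / self.T, dim=1),
            reduction="batchmean",
        ) * (self.T ** 2)

        # Feature distillation: match intermediate feature maps after projecting
        # the student features and resizing them to the teacher's spatial size
        s_feat = self.adapter(s_feat)
        if s_feat.shape[-2:] != t_feat.shape[-2:]:
            s_feat = F.interpolate(s_feat, size=t_feat.shape[-2:],
                                   mode="bilinear", align_corners=False)
        loss_feat = F.mse_loss(s_feat, t_feat)

        return loss_ce + self.alpha * loss_kd + self.beta * loss_feat
```

In a typical training loop the teacher runs in `eval()` mode under `torch.no_grad()`, and only the student and the adapter receive gradient updates.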