VaticanCameos99 / knowledge-distillation-for-unet
An implementation of knowledge distillation for segmentation: a small (student) UNet is trained from a larger (teacher) UNet, reducing the size of the network while achieving performance similar to the heavier model.
54 stars · May 7, 2020 · Updated 5 years ago
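The repository's exact loss formulation lives in its code, but the core idea of distillation for segmentation can be sketched independently: soften the teacher's and student's per-pixel class logits with a temperature `T` and minimize the KL divergence between the resulting distributions. The sketch below is a minimal NumPy illustration of that standard technique, not the repository's implementation; the function names and the `T=4.0` default are assumptions.

```python
import numpy as np

def softmax(logits, axis=0):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0, eps=1e-12):
    """Per-pixel KL(teacher || student) between temperature-softened
    class distributions, averaged over all pixels.

    Both logit arrays have shape (C, H, W): C classes per pixel.
    This is an illustrative sketch, not the repo's actual loss.
    """
    p = softmax(teacher_logits / T, axis=0)  # soft teacher targets
    q = softmax(student_logits / T, axis=0)  # soft student predictions
    kl = (p * (np.log(p + eps) - np.log(q + eps))).sum(axis=0)
    # The T^2 factor keeps gradient magnitudes comparable to a
    # hard-label loss when T varies (Hinton et al., 2015).
    return (T ** 2) * kl.mean()

# Toy example: 3 classes on a 2x2 "image".
rng = np.random.default_rng(0)
teacher = rng.normal(size=(3, 2, 2))
student = rng.normal(size=(3, 2, 2))
loss = distillation_loss(student, teacher)          # positive for a mismatched student
matched = distillation_loss(teacher, teacher)       # ~0 when student equals teacher
```

In practice this distillation term is combined with an ordinary supervised loss (e.g. cross-entropy or Dice against ground-truth masks), weighted by a mixing coefficient.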
