VaticanCameos99 / knowledge-distillation-for-unet

An implementation of knowledge distillation for semantic segmentation: a small student UNet is trained to mimic a larger teacher UNet, reducing network size while retaining performance close to the heavier model.
52 stars · Updated 4 years ago
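
The repository's exact loss formulation isn't shown on this page, so the following is a minimal sketch of the standard pixel-wise distillation objective (Hinton-style softened KL divergence against the teacher plus hard-label cross-entropy) as it is typically applied to UNet logits. The function name, `temperature`, and `alpha` weighting are illustrative assumptions, not the repo's API.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=4.0, alpha=0.5):
    """Pixel-wise KD loss for segmentation (illustrative, not the repo's exact loss).

    student_logits, teacher_logits: (N, C, H, W) raw logits
    targets: (N, H, W) integer class labels
    """
    # Soften both distributions per pixel over the class dimension.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)

    # KL term, scaled by T^2 so its gradient magnitude stays
    # comparable to the cross-entropy term as T varies.
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2

    # Standard supervised loss against the ground-truth masks.
    ce = F.cross_entropy(student_logits, targets)

    return alpha * kd + (1.0 - alpha) * ce


# Typical training-loop usage: the teacher is frozen in eval mode and
# only provides soft targets; only the student receives gradients.
# `teacher`, `student`, and `loader` are assumed to be defined elsewhere.
def train_step(student, teacher, images, masks, optimizer):
    teacher.eval()
    with torch.no_grad():
        t_logits = teacher(images)
    s_logits = student(images)
    loss = distillation_loss(s_logits, t_logits, masks)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

A higher `temperature` spreads the teacher's per-pixel class distribution, exposing inter-class similarity structure to the student; `alpha` trades off imitating the teacher against fitting the hard labels.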

Alternatives and similar repositories for knowledge-distillation-for-unet: