knowledge-distillation-for-unet

An implementation of knowledge distillation for semantic segmentation: a small (student) UNet is trained to mimic a larger (teacher) UNet, reducing the size of the network while achieving performance close to that of the heavier model.
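As a rough illustration of the idea, the sketch below shows the pixel-wise distillation loss commonly used in this setup: a KL-divergence term between temperature-softened teacher and student predictions, combined with ordinary cross-entropy against the ground-truth mask. The function name, temperature `T`, weight `alpha`, and tensor shapes are illustrative assumptions, not taken from this repository.

```python
# Minimal distillation-loss sketch (PyTorch); hyperparameters are assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, target, T=4.0, alpha=0.5):
    """Blend soft-target loss (teacher -> student) with hard-label CE.

    student_logits, teacher_logits: (N, C, H, W) raw per-pixel class scores.
    target: (N, H, W) integer ground-truth class map.
    """
    # Soft targets: per-pixel KL divergence between temperature-softened
    # teacher and student distributions, scaled by T^2 as in Hinton et al. (2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth mask.
    hard = F.cross_entropy(student_logits, target)
    return alpha * soft + (1.0 - alpha) * hard
```

In a typical training step the teacher is frozen (its logits computed under `torch.no_grad()`) and only the student's parameters are updated with this combined loss.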

Primary language: Python
