Knowledge Distillation for UNet

An implementation of knowledge distillation for semantic segmentation: a small (student) UNet is trained under the supervision of a larger (teacher) UNet, reducing the size of the network while achieving performance close to that of the heavier model.
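Concretely, distillation of this kind usually trains the student on a weighted combination of a soft, teacher-matching term and the usual hard-label segmentation term. The sketch below shows one common formulation in PyTorch; the function name `distillation_loss` and the `temperature`/`alpha` hyperparameters are illustrative assumptions, not necessarily this repository's exact implementation.

```python
# A minimal sketch of a per-pixel distillation loss for segmentation.
# `distillation_loss`, `temperature`, and `alpha` are assumed names,
# not this repository's actual API.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=4.0, alpha=0.5):
    """Weighted sum of a soft (teacher-matching) term and a hard-label term.

    student_logits, teacher_logits: (N, C, H, W) raw network outputs.
    targets: (N, H, W) ground-truth class indices.
    """
    # Soft targets: KL divergence between temperature-softened
    # distributions, computed per pixel. The T^2 factor keeps gradient
    # magnitudes comparable across temperatures (Hinton et al., 2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)

    # Hard targets: standard cross-entropy against the ground-truth masks.
    hard = F.cross_entropy(student_logits, targets)

    return alpha * soft + (1.0 - alpha) * hard
```

In a typical training step the teacher is frozen in eval mode and its logits are computed under `torch.no_grad()`, so only the student's weights receive gradients.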

Results:

Dataset: Carvana Image Masking Challenge

[Figure: models trained without knowledge distillation]

[Figure: models trained with knowledge distillation]
