ethliup/SelfDeblur

Testing on real pictures: I input a color picture but get a black-and-white picture. Is that expected?

Closed this issue · 8 comments


I am not sure I understand your question correctly. If you want to deblur a color image but get a black-and-white result, you can simply post-process the deblurred image to convert it to grayscale.
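If it helps, here is a minimal post-processing sketch with OpenCV; the file names are hypothetical, so point them at wherever demo.sh actually writes its results:

```python
import cv2

# Hypothetical paths; adjust to the actual demo.sh output location.
deblurred = cv2.imread("results/real/deblurred_000.png")   # OpenCV loads images as BGR
gray = cv2.cvtColor(deblurred, cv2.COLOR_BGR2GRAY)         # collapse 3 channels to 1
cv2.imwrite("results/real/deblurred_000_gray.png", gray)
```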

It seems that running demo.sh for /real can only output gray pictures, even when the inputs are color pictures.

Okay, if you use the demo pictures, it will output a grayscale image since the input image is grayscale. If you use your own color pictures with the model pretrained on real images, I am not sure. I can check it on Monday; I have never tried this before.


yes!


thanks


I tested it. Yes, you are right. It outputs a grayscale image even when I use a color image with the model pretrained on real gray images. I think this is reasonable, since the model is trained on real gray images. If you want to deblur real color images, you can fine-tune the network with your own data, initializing it with the model trained on the Fastec dataset.
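In case it is useful, a rough sketch of that fine-tuning step in PyTorch is below. The network class (`DeblurNet`), the checkpoint path, the dataset class, and the L1 loss are placeholders made up for illustration; in practice you would reuse the repository's own network, training script, and (self-supervised) loss, changing only the initialization and the training data.

```python
import torch
from torch.utils.data import DataLoader

# Placeholders: substitute the repo's actual network class, your own dataset,
# and the path of the checkpoint trained on the Fastec dataset.
from model import DeblurNet              # hypothetical import
from my_data import MyColorPairs         # your own color training data

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Initialize the network with the Fastec-trained weights.
net = DeblurNet().to(device)
state = torch.load("pretrained/fastec_model.pth", map_location=device)
net.load_state_dict(state, strict=False)   # strict=False tolerates minor key mismatches

# Fine-tune on your own color data with a small learning rate.
loader = DataLoader(MyColorPairs("path/to/your/data"), batch_size=2, shuffle=True)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-5)
criterion = torch.nn.L1Loss()              # placeholder; use the repo's actual training loss

net.train()
for epoch in range(10):
    for blurry, target in loader:
        blurry, target = blurry.to(device), target.to(device)
        optimizer.zero_grad()
        loss = criterion(net(blurry), target)
        loss.backward()
        optimizer.step()

torch.save(net.state_dict(), "finetuned_color_model.pth")
```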


thank you