[street] Totally black output on epoch 1
jodusan opened this issue · 4 comments
I have downloaded the Cityscapes dataset and started training. After about 1.5 hrs on a V100, the outputs in checkpoints/street/web are all black (the images ending with synthesized_image.jpg). Is this expected? If not, should the output images show something meaningful from the beginning, or only later in training?
Thanks
@Dulex123 I'm having the same issue on the pose dataset, except instead of black images I'm getting white ones. Were you able to solve it?
I have a similar problem. Have you fixed the issue? Is it a small-dataset problem?
I have the same problem when training on the pose/face example datasets.
On both example datasets, synthesized_image.jpg looks normal for the first several hundred iterations, but after that the synthesized images become totally black or white (with a very small yellow margin of about 1 pixel).
I observed that Df_fake and Df_real drop to zero at the iteration where the images turn white/black.
Can anyone help?
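For anyone who wants to catch this before wasting hours of training, here is a minimal sketch of an early-warning check on the discriminator losses. The function name, threshold, and patience window are my own choices, not anything from this repo; it just flags the pattern described above, where Df_fake and Df_real sit at zero for many consecutive iterations:

```python
def d_loss_collapsed(loss_history, threshold=1e-4, patience=50):
    """Return True if the discriminator loss (e.g. Df_fake or Df_real)
    has stayed near zero for `patience` consecutive iterations, which in
    this issue coincided with the all-black/all-white outputs."""
    if len(loss_history) < patience:
        return False
    return all(abs(v) < threshold for v in loss_history[-patience:])

# Example: a loss that decays and then flatlines at zero triggers the check.
history = [1.0 / (i + 1) for i in range(20)] + [0.0] * 50
print(d_loss_collapsed(history))  # True
```

You could call this every iteration on the logged loss values and stop (or restart) the run as soon as it returns True.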
I had the same problem. I delved into the code and found that the matting function at line 214 in ./models/networks/generator.py is somehow disabled (I don't know why), so the warped images are never combined into the final images.
If line 214 is changed as below:
`if not self.spade_combine:` → `if self.spade_combine:`
then the problem of all-black or all-white synthesized images after around 3000 ~ 4000 iterations in epoch 1 might disappear (please try it yourself).
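To clarify what skipping that branch loses: the matting step blends the flow-warped previous frame into the newly synthesized frame with a per-pixel mask, so without it the raw (and here, collapsed) synthesis is used directly. Below is a generic alpha-matting blend as an illustration only; it is not the repo's actual implementation, and `synthesized`, `warped`, and `mask` are placeholder per-pixel values I made up:

```python
def matting_combine(synthesized, warped, mask):
    """Blend a flow-warped previous frame into the synthesized frame.
    `mask` holds per-pixel weights in [0, 1]: 1 keeps the warped pixel,
    0 keeps the synthesized pixel (generic sketch, not the repo's code)."""
    return [m * w + (1.0 - m) * s
            for s, w, m in zip(synthesized, warped, mask)]

# Example on a 4-"pixel" row: mask=1 takes the warped value,
# mask=0 takes the synthesized value, 0.5 averages them.
print(matting_combine([0.0, 0.0, 0.0, 0.0],
                      [1.0, 1.0, 1.0, 1.0],
                      [1.0, 0.5, 0.0, 1.0]))  # [1.0, 0.5, 0.0, 1.0]
```

This also suggests why the fix helps: even if the synthesis branch degenerates, the warped previous frame still contributes real image content wherever the mask is nonzero.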