Why ReLU at line 120?
jwyang opened this issue · 2 comments
jwyang commented
Hi, Prof. Qi,
I am wondering why there is a ReLU layer at the top of the discriminator (shown at line 120 in lsgan.lua). With this ReLU layer, I found that the code cannot train the model at all.
Thanks,
LynnHo commented
I think it's there to keep the loss function non-negative.
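For illustration, here is a minimal, self-contained sketch (not the repo's actual architecture; the layer sizes and the name `critic` are just placeholders I chose) showing how a final `nn.ReLU()` clamps a Torch7 critic's output to be non-negative:

```lua
require 'nn'

-- Toy critic head: the last nn.ReLU() forces the scalar output to be >= 0,
-- which is the non-negativity constraint discussed above.
local critic = nn.Sequential()
critic:add(nn.Linear(64, 32))
critic:add(nn.LeakyReLU(0.2, true))
critic:add(nn.Linear(32, 1))
critic:add(nn.ReLU())  -- clamp output at zero

local x = torch.randn(4, 64)
print(critic:forward(x))  -- every entry is >= 0
```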
LinxiaohanLL commented
I'm a beginner. What software should I use to run this?