manicman1999/StyleGAN2-Tensorflow-2.0

loss problem

caishiqing opened this issue · 1 comment

I'm wondering why the generator loss is defined as gen_loss = K.mean(fake_output). Shouldn't it be gen_loss = K.mean(real_output) in order to confuse the discriminator?

The generator can't influence the real outputs (they come from the dataset itself), so gen_loss = K.mean(real_output) wouldn't make sense: no gradients would flow back through the generator. Generator losses are typically formulated as a function of D(G(z)), i.e. in terms of fake_output, such as -D(G(z)) in the hinge loss or -log(D(G(z))) in the non-saturating loss.
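A minimal sketch (not the repo's exact code) contrasting two common generator losses, both written purely in terms of fake_output = D(G(z)); the tensors here are stand-ins for real discriminator outputs, and the non-saturating variant assumes fake_output holds raw logits:

```python
import tensorflow as tf
from tensorflow.keras import backend as K

# Stand-in for discriminator scores on a batch of generated images, D(G(z)).
# In a real training step this would be discriminator(generator(z), training=True).
fake_output = tf.random.normal([8, 1])

# Hinge-style generator loss: push D(G(z)) up, i.e. minimize -D(G(z)).
gen_loss_hinge = -K.mean(fake_output)

# Non-saturating generator loss: minimize -log(sigmoid(D(G(z)))),
# assuming fake_output contains raw (pre-sigmoid) logits.
gen_loss_ns = K.mean(
    tf.nn.sigmoid_cross_entropy_with_logits(
        labels=tf.ones_like(fake_output), logits=fake_output
    )
)
```

In both cases the loss depends only on fake_output, so gradients flow back through the discriminator into the generator's weights. A loss built from real_output would have no dependence on the generator at all and would yield zero gradient for it.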