biuyq/CT-GAN

Implementation details about the consistency regularization (CT)


Hi, I am confused by a piece of the implementation in the file CT-GAN/Theano_classifier/CT_MNIST, quoted below:

mom_gen = T.mean(LL.get_output(layers[-3], gen_dat), axis=0)
mom_real = T.mean(LL.get_output(layers[-3], x_unl), axis=0)
loss_gen = T.mean(T.square(mom_gen - mom_real))
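As I understand it, this snippet computes a feature-matching generator loss: it matches the mean activation of one discriminator layer over a generated batch and an unlabeled real batch. To check my understanding, here is a minimal NumPy sketch of the same quantity (feat_gen and feat_real are placeholder arrays standing in for the layer outputs, not names from the repo):

```python
import numpy as np

# Placeholder activations with shape (batch_size, n_features):
# outputs of one discriminator layer for a generated batch and a real batch.
rng = np.random.default_rng(0)
feat_gen = rng.normal(size=(64, 128))   # hypothetical generated-batch features
feat_real = rng.normal(size=(64, 128))  # hypothetical real-batch features

# Feature matching: squared difference between the per-feature batch means.
mom_gen = feat_gen.mean(axis=0)
mom_real = feat_real.mean(axis=0)
loss_gen = np.mean((mom_gen - mom_real) ** 2)
print(loss_gen)
```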

I am not sure why the code uses LL.get_output(layers[-3], gen_dat), since Eq.(5) in the paper only involves the outputs of the last two layers (copied below).
(screenshot of Eq.(5) from the paper)

But LL.get_output(layers[-3], gen_dat) actually returns the output of the third-to-last layer. Moreover, the resulting loss_gen is included in the model training in CT-GAN/Theano_classifier/CT_MNIST (also copied below for convenience).

(screenshot of the training code from CT_MNIST)
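To make the indexing point concrete: with a plain Python list, layers[-3] is the third element from the end, whereas the last two layers would be layers[-2] and layers[-1]. The layer names below are invented just for illustration, not taken from the repo:

```python
# Hypothetical, ordered list of discriminator layers (input -> output).
layers = ["input", "conv", "pool", "dense", "penultimate", "output"]

print(layers[-1])  # 'output'      -> last layer
print(layers[-2])  # 'penultimate' -> second-to-last layer
print(layers[-3])  # 'dense'       -> third-to-last layer, the one used in loss_gen above
```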

This seems different from Eq.(5), which confuses me. I hope you can help me resolve this confusion. Thanks in advance.