y0ast/Variational-Autoencoder

order of arguments in T.nnet.binary_crossentropy

aripakman opened this issue · 2 comments

I think that in line 119 of VAE.py, the order of the arguments should be

       logpxz = - T.nnet.binary_crossentropy(x, reconstructed_x).sum(axis=1)

instead of

       logpxz = - T.nnet.binary_crossentropy(reconstructed_x,x).sum(axis=1)

Note the definition of the function (http://deeplearning.net/software/theano/library/tensor/nnet/nnet.html#theano.tensor.nnet.nnet.binary_crossentropy):

binary_crossentropy(t, o) = -t log(o) - (1 - t) log(1 - o)

In our case t is the given value x, and o is reconstructed_x, the symbolic expression.
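For binary x, that formula is exactly the negative Bernoulli log-likelihood of the data under the decoder's probabilities, which is why negating and summing it recovers logpxz. A minimal NumPy check with hypothetical values (not the actual VAE.py tensors):

```python
import numpy as np

x = np.array([1.0, 0.0, 1.0])          # binary data: the targets t
x_recons = np.array([0.9, 0.2, 0.7])   # decoder probabilities: the outputs o

# -t*log(o) - (1-t)*log(1-o), elementwise
bce = -x * np.log(x_recons) - (1 - x) * np.log(1 - x_recons)

# Bernoulli log-likelihood: log(o) where x == 1, log(1 - o) where x == 0
loglik = np.log(np.where(x == 1, x_recons, 1 - x_recons))

assert np.allclose(-bce, loglik)
```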

y0ast commented

You're right as far as the math goes. But Theano implements the function with the opposite argument order:

theano.tensor.nnet.nnet.binary_crossentropy(output, target)

See also the example a few lines below that math line:

x, y, b, c = T.dvectors('x', 'y', 'b', 'c')
W = T.dmatrix('W')
V = T.dmatrix('V')
h = T.nnet.sigmoid(T.dot(W, x) + b)
x_recons = T.nnet.sigmoid(T.dot(V, h) + c)
recon_cost = T.nnet.binary_crossentropy(x_recons, x).mean()
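So with Theano's convention the code passes (output, target), which plugs into the math formula as (o, t) rather than (t, o). The two orders compute different quantities in general, which a plain NumPy sketch (hypothetical soft values, chosen only to show the asymmetry) makes concrete:

```python
import numpy as np

def binary_crossentropy(t, o):
    # The formula from the Theano docs: -t*log(o) - (1-t)*log(1-o)
    return -t * np.log(o) - (1 - t) * np.log(1 - o)

t = np.array([0.2, 0.9])
o = np.array([0.5, 0.5])

a = binary_crossentropy(t, o).sum()
b = binary_crossentropy(o, t).sum()

# The function is not symmetric in its arguments, so the order matters
assert not np.isclose(a, b)
```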

Very confusing; it would be great if you could open an issue at Theano to fix their documentation :)

I see, thanks!