Activation Functions tf.maximum
kopytjuk opened this issue · 1 comments
kopytjuk commented
Why do you use tf.maximum(conv, conv*alpha) as an activation function? Try ReLU (tf.nn.relu) instead - it should help!
The next problem was scaling: the images you used were already scaled to [0, 1] :)) Working version here:
https://github.com/kopytjuk/deep-emoji-gan/blob/master/Smiley_Generator.ipynb
anoff commented
It's a leaky ReLU.
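For context: tf.maximum(conv, conv*alpha) computes max(x, αx), which for 0 < α < 1 returns x when x ≥ 0 and αx when x < 0 - exactly the leaky ReLU. A minimal sketch of that equivalence (using NumPy instead of TensorFlow, with a hypothetical alpha of 0.2, to keep it self-contained):

```python
import numpy as np

def leaky_relu_via_max(x, alpha=0.2):
    # max(x, alpha*x): for x >= 0 this picks x (since alpha*x <= x),
    # for x < 0 it picks alpha*x (since alpha*x > x when 0 < alpha < 1).
    # That is precisely leaky ReLU.
    return np.maximum(x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu_via_max(x))  # [-0.4 -0.1  0.   1.   3. ]
```

A plain ReLU (tf.nn.relu) would instead zero out all negative inputs; the leaky variant keeps a small gradient for x < 0, which is a common choice in GAN discriminators.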