MappingToLatent has no activation
eitanrich opened this issue · 2 comments
eitanrich commented
Is it intentional that the D module (MappingToLatent) consists of three F.Linear layers without any activations in between (e.g. no ReLU / Leaky ReLU)?
Line 894 in 5d8362f
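For context, a minimal sketch of why this matters (hypothetical layer sizes, not the repo's actual code): stacked Linear layers with no nonlinearity between them collapse algebraically to a single affine map, so the extra layers add no representational power.

```python
import torch
import torch.nn as nn

# Hypothetical 3-layer mapping with no activations, as described in the issue.
no_act = nn.Sequential(
    nn.Linear(256, 256),
    nn.Linear(256, 256),
    nn.Linear(256, 256),
)

# Without nonlinearities the stack is equivalent to one affine map W x + b:
with torch.no_grad():
    W = no_act[2].weight @ no_act[1].weight @ no_act[0].weight
    b = (no_act[2].weight @ (no_act[1].weight @ no_act[0].bias + no_act[1].bias)
         + no_act[2].bias)

x = torch.randn(4, 256)
assert torch.allclose(no_act(x), x @ W.T + b, atol=1e-4)

# The presumably intended variant interleaves activations, e.g. LeakyReLU,
# which breaks the collapse and makes each layer count:
with_act = nn.Sequential(
    nn.Linear(256, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 256),
)
```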
podgorskiy commented
That's an error, but I won't change it, to stay consistent with the published results.
Most likely the effect won't be significant, but I'm curious to see how the results would differ.
eitanrich commented
Maybe the behavior is related to Implicit Rank-Minimizing Autoencoder :)