podgorskiy/ALAE

MappingToLatent has no activation

eitanrich opened this issue · 2 comments

Is it intentional that the D module (MappingToLatent) consists of three ln.Linear layers without any activations (e.g. no ReLU / LeakyReLU) in between?

ALAE/net.py

Line 894 in 5d8362f

block = ln.Linear(inputs, outputs, lrmul=0.1)
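For context, here is a minimal sketch of the structure being asked about, using plain `torch.nn.Linear` as a stand-in for the repo's equalized-learning-rate `ln.Linear`; layer count and dimensions are illustrative, not copied from `net.py`. The point is that the blocks are chained with no nonlinearity, so the stack collapses to a single affine map.

```python
import torch.nn as nn

# Hypothetical stand-in for MappingToLatent as described in this issue:
# several linear layers applied back to back with no activation in between.
class MappingToLatentSketch(nn.Module):
    def __init__(self, inputs=512, outputs=512, mapping_layers=3):
        super().__init__()
        self.blocks = nn.ModuleList(
            [nn.Linear(inputs if i == 0 else outputs, outputs) for i in range(mapping_layers)]
        )

    def forward(self, x):
        for block in self.blocks:
            x = block(x)  # no ReLU / LeakyReLU here, which is what the issue asks about
        return x
```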

That's an error, but I won't change it, in order to stay consistent with the published results.
Most likely the effect won't be significant, but I'm curious to see how the results would differ.
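For anyone who wants to try the change locally, the fix would presumably be a nonlinearity between the blocks. A sketch against the class above, assuming a LeakyReLU with the 0.2 slope that StyleGAN-style mapping networks commonly use (the slope is an assumption, not taken from the repo):

```python
import torch.nn.functional as F

# Drop-in replacement for the forward() in the sketch above, showing the
# change under discussion (not applied in the repo): a LeakyReLU after
# every block except the last.
def forward_with_activation(self, x):
    for i, block in enumerate(self.blocks):
        x = block(x)
        if i < len(self.blocks) - 1:
            x = F.leaky_relu(x, 0.2)  # assumed 0.2 slope, typical for StyleGAN-style mappings
    return x
```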

Maybe the behavior is related to the Implicit Rank-Minimizing Autoencoder :)
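For readers unfamiliar with the reference: the Implicit Rank-Minimizing Autoencoder (Jing et al., 2020) deliberately inserts extra linear layers before the latent code, relying on the implicit regularization of gradient descent on deep linear chains to reduce the rank of the learned latent representation. A rough sketch of that idea (layer count, dimensions, and the bias-free choice are illustrative, not taken from the paper):

```python
import torch.nn as nn

latent_dim = 512   # illustrative
extra_linear = 4   # illustrative; the paper treats this count as a hyperparameter

# Extra linear layers between the encoder output and the latent code.
# Training this deep linear chain implicitly encourages low-rank latent
# codes, which is loosely what stacking activation-free linear layers in
# MappingToLatent would also do.
rank_minimizing_head = nn.Sequential(
    *[nn.Linear(latent_dim, latent_dim, bias=False) for _ in range(extra_linear)]
)
```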