vislearn/FrEIA

Enable different initializations in LearnedElementwiseScaling-Layer

niels-leif-bracher opened this issue · 1 comment

Hey Lynton,

I think the current definition of the parameter self.s disables different initializations: self.s will always be initialized to zeros everywhere, regardless of init_scale:

https://github.com/VLL-HD/FrEIA/blob/f59b449d3cfd249472290e3208d44dc43ecd0ba8/FrEIA/modules/inv_auto_layers.py#L148

This change would allow different initializations:

self.s = nn.Parameter(np.log(init_scale) * torch.ones(1, *dims_in[0]))
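A minimal sketch of why the proposed line matters (using NumPy to stand in for the tensor math; the assumption here is that the original code multiplied np.log(init_scale) into a zero tensor, which the linked line suggests):

```python
import numpy as np

init_scale = 2.0

# If the parameter is built from a zero tensor, the init_scale factor
# is multiplied away and the parameter always starts at zero:
buggy = np.log(init_scale) * np.zeros((1, 3))

# Building from a tensor of ones preserves log(init_scale) as the
# starting value, so different init_scale choices take effect:
fixed = np.log(init_scale) * np.ones((1, 3))

print(buggy)  # [[0. 0. 0.]]
print(fixed)  # [[0.6931... 0.6931... 0.6931...]]
```

With the fix, exp(self.s) starts at init_scale rather than always at 1, which is presumably the intended behavior of the scaling layer.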

Thanks for pointing out this bug! It's fixed now, see #118