theislab/batchglm

Model initialisation with reference model

davidsebfischer opened this issue · 4 comments

Why is this

init_loc = np.random.uniform(
a random initialisation within a very small interval rather than a constant? I do not see the benefit of introducing this stochasticity, and it can break reproducibility. Since this is additive in log space, I would just set the parameters that did not occur in the full model to zero.

I can modify it if we decide to change this.
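For illustration, a minimal sketch of the two strategies (the interval bounds, `n_params`, and variable names are hypothetical, not the actual batchglm code):

```python
import numpy as np

n_params = 5  # hypothetical number of location parameters to initialise

# Current behaviour (sketch): random values drawn from a very small interval,
# which introduces run-to-run variability in the starting point.
init_loc_random = np.random.uniform(low=-1e-8, high=1e-8, size=n_params)

# Proposed behaviour: since the parameters are additive in log space,
# parameters that did not occur in the full model can start at exactly zero,
# which is deterministic and therefore reproducible.
init_loc_constant = np.zeros(n_params)
```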

Hoeze commented

I always used this to make sure that not all values in a tensor are equal; I just copied it over.
This can also be set to 0, I think...

Ok, I will try it and merge if it works.