theislab/batchglm

Remove log(exp(mu)) and log(exp(r)) where possible

Hoeze opened this issue · 4 comments

Hoeze commented

The current implementation of the negative binomial distribution uses log_mu and log_r. Since these are currently obtained by taking the log of quantities that are themselves computed as exponentials, we end up with redundant log(exp(...)) operations. Directly providing log_mu and log_r saves computation time and improves numeric stability.
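
For reference, the NB log-likelihood can be written so that only the log-scale parameters enter directly (my own restatement of the standard formula, not quoted from the batchglm code):

```latex
\ell(x \mid \mu, r)
  = \log\Gamma(x + r) - \log\Gamma(r) - \log\Gamma(x + 1)
  + x\,\bigl(\log\mu - \log(r + \mu)\bigr)
  + r\,\bigl(\log r - \log(r + \mu)\bigr),
\qquad
\log(r + \mu) = \operatorname{logsumexp}(\log\mu, \log r)
```

So when the model already works with log_mu and log_r, the likelihood never needs mu or r on the natural scale except inside the Gamma terms.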

I will hard-code the maths of the log-likelihood as a TensorFlow graph in a separate function and link that function to the likelihood evaluations in BasicModelGraph. This should replace all calls to the previous implementation, which contains this redundancy because of its interface to BasicModelGraph. I will do a quick benchmark of how this affects run time.
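
A minimal sketch of what such a function could look like, assuming a TensorFlow 2-style API; the function name nb_log_likelihood and its argument names are hypothetical, not the actual batchglm interface:

```python
import tensorflow as tf


def nb_log_likelihood(x, log_mu, log_r):
    """Negative binomial log-likelihood parameterized directly by
    log_mu and log_r, so no log(exp(.)) round trip is needed.
    Hypothetical sketch, not the batchglm implementation.
    """
    x = tf.cast(x, log_mu.dtype)
    # log(r + mu) evaluated in log-space via logsumexp for stability
    log_r_plus_mu = tf.math.reduce_logsumexp(
        tf.stack([log_r, log_mu], axis=0), axis=0
    )
    r = tf.exp(log_r)  # the lgamma terms still need r on the natural scale
    return (
        tf.math.lgamma(x + r)
        - tf.math.lgamma(r)
        - tf.math.lgamma(x + 1.0)
        + x * (log_mu - log_r_plus_mu)
        + r * (log_r - log_r_plus_mu)
    )
```

Keeping log(r + mu) in log-space via logsumexp is what gives the numeric-stability benefit; only the lgamma terms ever leave the log scale.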

Hoeze commented

You can look into the rsa_dev branch; there is already such a TF-NB implementation there.

Hoeze commented

Although it does not include the improvement mentioned here and is not very clean...