edenton/svg

KL loss doesn't decrease


Hi,
Thanks for sharing the code, nice work!
When I train the learned prior model on the SM-MNIST dataset, the reconstruction loss decreases,
but the KL loss keeps increasing. Is this normal?

Thanks.

I have the same problem. I think it is normal as long as the total loss is decreasing.
The KL loss may increase simply because it starts out very small at initialization.

Intuitively, if the KL loss went all the way to 0, that would itself seem problematic: it would mean the posterior is no different from the prior, so the latent variable carries no extra information.
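To make that point concrete, here is a minimal sketch (plain Python, not the repo's code) of the per-dimension KL divergence between two Gaussians, which is what the KL term between SVG-LP's posterior and learned prior sums over. It is exactly 0 only when the posterior matches the prior:

```python
import math

def kl_gaussian(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for two 1-D Gaussians."""
    var_q, var_p = math.exp(logvar_q), math.exp(logvar_p)
    return 0.5 * (logvar_p - logvar_q
                  + (var_q + (mu_q - mu_p) ** 2) / var_p
                  - 1.0)

# Posterior identical to the prior: KL is exactly 0,
# i.e. the latent z tells the decoder nothing new.
print(kl_gaussian(0.0, 0.0, 0.0, 0.0))  # 0.0

# Posterior shifted away from the prior: KL is positive.
print(kl_gaussian(1.0, 0.0, 0.0, 0.0))  # 0.5
```

So a small but nonzero KL at convergence is what you want: the posterior encodes some information the prior cannot predict, without diverging arbitrarily.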

For reference, running SM-MNIST with the default commands here for SVG-LP gives KL values of about 5.