Variational AE: KL-divergence
Closed this issue · 1 comments
Make42 commented
Regarding https://jaan.io/what-is-variational-autoencoder-vae-tutorial/
The KL divergence is
KL(P || Q) = sum_i P(i) log( P(i) / Q(i) )
according to https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence#Definition
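For concreteness, here is a quick numeric sketch of that definition (my own illustration, not from the post), for two small discrete distributions:

```python
import math

def kl_divergence(p, q):
    """Discrete KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)), in nats.

    Terms with P(i) == 0 contribute zero by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two example distributions over a binary outcome
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ln(5/3), about 0.5108 nats
```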
How do you get from that definition to your formula? I mean the one displayed after "We can use the Kullback-Leibler divergence, which measures the information lost when using q to approximate p (in units of nats):"
Starting from there, it got difficult to follow your (otherwise great) blog post.
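For what it's worth, my best guess at the missing step: write the definition as an expectation under q, then use p(z|x) = p(z,x) / p(x):

KL( q(z|x) || p(z|x) ) = E_q[ log q(z|x) ] - E_q[ log p(z|x) ]
                       = E_q[ log q(z|x) ] - E_q[ log p(z,x) ] + log p(x)

since log p(x) does not depend on z and so comes out of the expectation. Is that the step you intended?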