
InfoGAN: Looking for a good explanation of the relationship between JS divergence, Jensen's inequality, and Shannon entropy

avital opened this issue

Why is the Jensen-Shannon divergence named this way?

The answer is roughly this: https://dit.readthedocs.io/en/latest/measures/divergences/jensen_shannon_divergence.html#derivation, where "x" is a convex combination of P and Q, and the expectation is taken over the binary variable that selects which of {P, Q} to draw from.
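
For concreteness, here is a sketch of the identity behind the name (standard definitions, with M = ½(P + Q) denoting the mixture; this is my summary of the linked derivation, not a quote from it):

```latex
% Definition 1: average KL divergence to the mixture
\mathrm{JSD}(P \parallel Q)
  = \tfrac12 D_{\mathrm{KL}}(P \parallel M) + \tfrac12 D_{\mathrm{KL}}(Q \parallel M),
  \qquad M = \tfrac12 (P + Q)

% Expanding the KL terms and collecting the -\log M pieces into H(M) gives
\mathrm{JSD}(P \parallel Q) = H(M) - \tfrac12 \bigl( H(P) + H(Q) \bigr)

% Since Shannon entropy H is concave, Jensen's inequality applied to the
% binary selector variable B (which picks P or Q with probability 1/2) gives
%   H(\mathbb{E}_B[\,\cdot\,]) = H(M) \ge \mathbb{E}_B[H(\,\cdot\,)] = \tfrac12 (H(P) + H(Q)),
% so \mathrm{JSD} \ge 0: it is exactly the Jensen gap of the Shannon entropy,
% hence the name "Jensen-Shannon" divergence.
```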

We'd like a clear write-up of this definition of the JS divergence, including a proof of the equivalence between this definition and the others.
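
As a starting point, here is a minimal Python sketch that checks the two definitions above agree numerically on random distributions (function names like `js_divergence_kl` are my own, just for illustration):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in nats; 0 * log(0) is treated as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def kl(p, q):
    """KL divergence D_KL(p || q) in nats (assumes q > 0 wherever p > 0)."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def js_divergence_kl(p, q):
    """Definition 1: average KL divergence to the mixture M = (p + q) / 2."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def js_divergence_entropy(p, q):
    """Definition 2: Jensen gap of the entropy, H(M) - (H(p) + H(q)) / 2."""
    m = 0.5 * (p + q)
    return entropy(m) - 0.5 * (entropy(p) + entropy(q))

rng = np.random.default_rng(0)
for _ in range(5):
    p = rng.dirichlet(np.ones(10))
    q = rng.dirichlet(np.ones(10))
    a, b = js_divergence_kl(p, q), js_divergence_entropy(p, q)
    assert np.isclose(a, b), (a, b)
    print(f"JSD = {a:.6f} (both definitions agree)")
```

A numerical check like this isn't a proof, of course, but it makes the equivalence concrete and gives readers something to poke at while working through the algebra.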