google-deepmind/optax

Wrong formula for `softmax_cross_entropy` in online doc

ziyuanzhao2000 opened this issue · 1 comment

On the latest version of this page, it says each element of the output vector from `softmax_cross_entropy` should be

$$ \sigma_i = \log \left(\frac{\sum_j y_{ij} \exp(x_{ij}) }{\sum_j \exp(x_{ij})} \right) $$

But based on the source code, this is clearly a typo and should instead be:

$$ \sigma_i = -\sum_j y_{ij} \log \left(\frac{ \exp(x_{ij}) }{\sum_k \exp(x_{ik})} \right) $$
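
For anyone who wants to sanity-check this, here is a minimal sketch (the array values are arbitrary) comparing the formula above, computed via `jax.nn.log_softmax`, against `optax.softmax_cross_entropy`:

```python
import jax
import jax.numpy as jnp
import optax

# Arbitrary logits x (batch of 2, 3 classes) and one-hot labels y.
x = jnp.array([[1.0, 2.0, 0.5],
               [0.3, -1.2, 2.0]])
y = jnp.array([[0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]])

# Formula above: sigma_i = -sum_j y_ij * log(exp(x_ij) / sum_k exp(x_ik)),
# computed here with the numerically stable log_softmax.
manual = -jnp.sum(y * jax.nn.log_softmax(x, axis=-1), axis=-1)

# Library implementation.
library = optax.softmax_cross_entropy(logits=x, labels=y)

print(manual)
print(library)  # should match `manual` elementwise
```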

Please fix this so as not to confuse other users; thank you!

Good catch @ziyuanzhao2000. I proposed a fix here: #1041