peterroelants/peterroelants.github.io

Clarification with notes on "Understanding Gaussian Processes"

larrylawl opened this issue · 1 comment

Hi Peter, first thanks so much for putting your notes on machine learning online - I found the article "Understanding Gaussian processes" particularly rigorous and helpful.

Can I please clarify two things in that particular post?

  1. In the section "Predictions from posterior", can I please verify that the computations for the conditional distribution are correct? Specifically, you give

\mu_{2|1} = \mu_{2} + \Sigma_{21} \Sigma_{11}^{-1} \left(\mathbf{y}_{1} - \mu_{1}\right)

\Sigma_{2|1} = \Sigma_{22} - \Sigma_{21} \Sigma_{11}^{-1} \Sigma_{12}

should be

\mu_{2|1} = \mu_{2} + \Sigma_{12} \Sigma_{22}^{-1} \left(\mathbf{y}_{1} - \mu_{1}\right)

\Sigma_{2|1} = \Sigma_{22} - \Sigma_{12} \Sigma_{22}^{-1} \Sigma_{21}

I've derived this based on your post on the conditional distribution here.
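For what it's worth, which of the two versions above holds can be checked with a quick Monte Carlo experiment on a toy 2-D joint Gaussian (all numbers below are made up for illustration):

```python
import numpy as np

# Toy joint Gaussian over (y1, y2); the values are hypothetical,
# not taken from the blog post.
rng = np.random.default_rng(0)
mu1, mu2 = 0.5, -0.2
S11, S12, S21, S22 = 1.0, 0.6, 0.6, 2.0
mu = np.array([mu1, mu2])
Sigma = np.array([[S11, S12], [S21, S22]])

y1_obs = 1.0

# Candidate A (as in the post): mu_{2|1} = mu2 + S21 S11^{-1} (y1 - mu1)
mean_a = mu2 + S21 / S11 * (y1_obs - mu1)
var_a = S22 - S21 / S11 * S12

# Candidate B (the proposed correction): mu_{2|1} = mu2 + S12 S22^{-1} (y1 - mu1)
mean_b = mu2 + S12 / S22 * (y1_obs - mu1)
var_b = S22 - S12 / S22 * S21

# Empirical conditional: sample the joint, keep draws with y1 close to y1_obs.
samples = rng.multivariate_normal(mu, Sigma, size=2_000_000)
keep = samples[np.abs(samples[:, 0] - y1_obs) < 0.01, 1]
print(mean_a, mean_b, keep.mean())  # candidate A agrees with the samples
print(var_a, var_b, keep.var())
```

With these numbers candidate A gives mean 0.1 and variance 1.64, and the empirical conditional statistics land on candidate A rather than candidate B.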

  2. In the section "Predictions from posterior", you stated: "Keep in mind that y_1 and y_2 are jointly Gaussian since they both should come from the same function." Can I please clarify that "same function" means that both y_1 and y_2 come from the same Gaussian distribution over functions f(x)?
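To make the "same function" idea concrete, here is a small sketch: draw one function from a zero-mean GP prior with an exponentiated quadratic kernel and read it off at both the "observed" inputs X1 and the "test" inputs X2. The inputs and length scale are made-up examples; the point is that (y1, y2) come from a single kernel matrix over all inputs, hence are jointly Gaussian by construction.

```python
import numpy as np

def exponentiated_quadratic(xa, xb, length=1.0):
    # Pairwise squared distances via broadcasting, then the RBF kernel.
    d2 = (xa[:, None] - xb[None, :]) ** 2
    return np.exp(-0.5 * d2 / length**2)

rng = np.random.default_rng(42)
X1 = np.array([-1.0, 0.0, 1.0])  # hypothetical observed inputs
X2 = np.array([0.5, 2.0])        # hypothetical test inputs
X = np.concatenate([X1, X2])

# One kernel matrix over all inputs: the joint covariance of (y1, y2).
K = exponentiated_quadratic(X, X)

# Draw a single function; small jitter keeps the covariance numerically PSD.
f = rng.multivariate_normal(np.zeros(len(X)), K + 1e-10 * np.eye(len(X)))
y1, y2 = f[:len(X1)], f[len(X1):]
print(y1, y2)  # both are read off the same sampled function
```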

Thanks for your time!

Hi Larry,

Please note that in the Gaussian process blog post we are conditioning 2 on 1 (2|1), while in the blog post on the conditional distribution we are conditioning x on y (x|y). If we align the variables we have x=1 and y=2; note that the conditioning is swapped in this case (y|x).
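Spelling out that substitution (a sketch, using the notation of the two posts): the swapped conditional y|x of a joint Gaussian over (x, y) is

\mu_{y|x} = \mu_{y} + \Sigma_{yx} \Sigma_{xx}^{-1} \left(x - \mu_{x}\right)

\Sigma_{y|x} = \Sigma_{yy} - \Sigma_{yx} \Sigma_{xx}^{-1} \Sigma_{xy}

and replacing x with 1 and y with 2 gives

\mu_{2|1} = \mu_{2} + \Sigma_{21} \Sigma_{11}^{-1} \left(\mathbf{y}_{1} - \mu_{1}\right)

\Sigma_{2|1} = \Sigma_{22} - \Sigma_{21} \Sigma_{11}^{-1} \Sigma_{12}

which recovers exactly the equations in the Gaussian process post.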