This is an implementation of the scalable heteroscedastic GP (HGP) developed in Haitao Liu, Yew-Soon Ong, and Jianfei Cai, "Large-scale Heteroscedastic Regression via Gaussian Process." Please see the paper for further details.
We here focus on heteroscedastic Gaussian process regression, which models both the latent function $f$ and the log of the input-dependent noise variance $g$, so that the noise level is allowed to vary across the input domain.
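To make the setup concrete, the following NumPy sketch draws toy data from a heteroscedastic model $y = f(x) + \varepsilon$, $\varepsilon \sim \mathcal{N}(0, e^{g(x)})$. The particular choices of $f$ and $g$ here are hypothetical, picked only for illustration; the repo's own demo is in MATLAB.

```python
import numpy as np

def sample_heteroscedastic(n=500, seed=0):
    """Draw toy data y = f(x) + eps with input-dependent noise.

    f and g below are illustrative choices, not the ones from the paper;
    the latent log noise variance g(x) controls the noise via exp(g(x)).
    """
    rng = np.random.default_rng(seed)
    x = np.sort(rng.uniform(-5.0, 5.0, n))
    f = np.sinc(x)                             # latent mean function f(x)
    g = -3.0 + 0.4 * np.abs(x)                 # latent log noise variance g(x)
    y = f + rng.normal(0.0, np.exp(0.5 * g))   # noise std = exp(g(x) / 2)
    return x, y, f, g
```

Note that the noise standard deviation grows with $|x|$ here, which is exactly the structure a homoscedastic GP cannot capture.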
To improve the scalability of HGP, we first develop a variational sparse inference algorithm, named VSHGP, to handle large-scale datasets. This is performed by introducing inducing points for both the latent function $f$ and the latent log noise variance $g$, which yields an analytical variational lower bound that can be optimized efficiently.
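For intuition about why inducing points reduce cost, the sketch below implements only the deterministic subset-of-regressors (SoR) approximation with $m$ inducing points, in NumPy rather than the repo's MATLAB; it is not the VSHGP bound itself, which additionally treats the heteroscedastic noise variationally. All hyperparameter values are illustrative.

```python
import numpy as np

def rbf(a, b, ell=1.0, sf2=1.0):
    """Squared-exponential kernel; ell and sf2 are illustrative hyperparameters."""
    d = a[:, None] - b[None, :]
    return sf2 * np.exp(-0.5 * (d / ell) ** 2)

def sparse_gp_predict(x, y, xm, xs, noise=0.1):
    """SoR predictive mean/variance from m inducing points xm.

    Cost is O(n m^2) instead of O(n^3): only the m x m system below is
    factorized. VSHGP builds a variational bound on top of this idea.
    """
    m = len(xm)
    Kmm = rbf(xm, xm) + 1e-8 * np.eye(m)
    Knm = rbf(x, xm)
    Ksm = rbf(xs, xm)
    # Sigma = (Kmm + Kmn Knm / noise)^{-1}, the m x m posterior factor
    A = Kmm + Knm.T @ Knm / noise
    L = np.linalg.cholesky(A)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Knm.T @ y / noise))
    mean = Ksm @ alpha                   # K*m Sigma Kmn y / noise
    V = np.linalg.solve(L, Ksm.T)
    var = np.sum(V ** 2, axis=0)         # diag(K*m Sigma Km*); degenerate
    return mean, var                     # SoR underestimates var far from xm
```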
This figure shows a toy example of distributed VSHGP (DVSHGP). Here, we partition the whole set of 500 training data points into five subsets, yielding five VSHGP experts (marked with different colors), each with its own inducing points for the latent function $f$ and the log noise variance $g$. We observe that
- through five distributed local experts, DVSHGP can efficiently employ up to 100 inducing points for modeling, while at the same time spreading the computational load across the experts;
- DVSHGP successfully describes both the underlying function $f$ and the heteroscedastic noise variance (captured by $g$).
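Each local expert returns its own Gaussian prediction, so the experts' outputs must be fused into one. The sketch below shows one standard fusion rule, the generalized product of experts (gPoE), which combines per-expert means and variances precision-weighted; whether this matches the paper's exact aggregation is an assumption here, and the weights `betas` are hypothetical.

```python
import numpy as np

def gpoe_combine(mus, s2s, betas=None):
    """Fuse independent Gaussian expert predictions (mu_i, s2_i).

    mus, s2s: arrays of shape (n_experts, n_test).
    betas: per-expert weights summing to 1 (uniform if None) -- an
    illustrative choice, not necessarily the paper's weighting.
    """
    mus, s2s = np.asarray(mus, float), np.asarray(s2s, float)
    if betas is None:
        betas = np.full(mus.shape[0], 1.0 / mus.shape[0])
    prec = np.sum(betas[:, None] / s2s, axis=0)            # combined precision
    mean = np.sum(betas[:, None] * mus / s2s, axis=0) / prec
    return mean, 1.0 / prec
```

With uniform weights, an expert with smaller predictive variance (i.e., one whose subset covers the query point) dominates the combined prediction, which is why each colored expert in the figure matters most on its own region.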
To run the example file, execute `Demo_DVSHGP_toy.m` in MATLAB.