
GP-Wrapper

GP-Wrapper is a front-end for GPyTorch. It abstracts away the training loop, eliminating much of the usual boilerplate: a simple GP.fit(X, y) is enough.

We took inspiration from skorch and adapted its code for our front-end design.

Four steps to work with our GPyTorch Wrapper:

1. Define a GP model

2. Wrap the model in one of the following GP wrappers:

- ExactGaussianProcess (uses criterion = gpytorch.mlls.ExactMarginalLogLikelihood by default)

     - ExactGaussianProcessRegressor (additionally uses likelihood = GaussianLikelihood by default)
     
- VariationalGaussianProcess (uses criterion = gpytorch.mlls.VariationalMarginalLogLikelihood by default)

     - VariationalGaussianProcessRegressor (additionally uses likelihood = GaussianLikelihood by default)
     
     - VariationalGaussianProcessClassifier (additionally uses likelihood = BernoulliLikelihood by default)

3. Call fit(x_train, y_train) to find optimal model hyperparameters (default optimizer: torch.optim.Adam)

4. Call predict_proba(x_test) to return a GaussianRandomVariable as the predictive output for x_test
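The training loop that fit abstracts away maximizes the marginal log likelihood, and for exact GP regression the resulting predictive distribution has a closed form. As a rough illustration of the math under the hood, here is a minimal NumPy sketch of an exact GP regression posterior (the rbf_kernel helper and all parameter values are illustrative assumptions, not the wrapper's actual implementation):

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.2):
    # Squared-exponential kernel: k(x, x') = exp(-(x - x')^2 / (2 l^2))
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4):
    # Exact GP regression posterior:
    #   mean = K_*n (K_nn + sigma^2 I)^-1 y
    #   cov  = K_** - K_*n (K_nn + sigma^2 I)^-1 K_n*
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    mean = K_star @ np.linalg.solve(K, y_train)
    cov = rbf_kernel(x_test, x_test) - K_star @ np.linalg.solve(K, K_star.T)
    return mean, np.diag(cov)

x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x)
mean, var = gp_posterior(x, y, x)
# With small observation noise, the posterior mean nearly
# interpolates the training targets.
print(np.max(np.abs(mean - y)) < 0.05)
```

With GP-Wrapper, all of this is hidden behind fit and predict_proba; the sketch only shows what those calls compute for the exact-regression case.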

Which notebooks to read

If you're just starting out with Gaussian processes, check out the simple regression and classification examples. They show the most basic usage of GPyTorch and provide links to useful reading material.

If you have a specific task in mind, then check out this flowchart to find which notebook will help you.

Here's a verbal summary of the flowchart:

Regression

Do you have lots of data?

No: Start with the basic example

Is your training data one-dimensional?

Yes: Use KissGP regression

Does your output decompose additively?

Yes: Use Additive Grid Interpolation

Is your training data three-dimensional or less?

Yes: Exploit Kronecker structure

No: Try Deep Kernel regression (example pending)
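The Kronecker step above rests on a standard identity: when inputs lie on a multi-dimensional grid and the kernel factors across dimensions, the full covariance is a Kronecker product of small per-dimension covariances, and its eigendecomposition comes from the factors' much smaller ones. A quick NumPy check of that identity (illustrative only, not the GPyTorch implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_spd(n):
    # Random symmetric positive-definite matrix (stand-in for a
    # per-dimension kernel matrix on a grid axis)
    a = rng.standard_normal((n, n))
    return a @ a.T + n * np.eye(n)

K1, K2 = rand_spd(3), rand_spd(4)
K = np.kron(K1, K2)  # full covariance over the 3x4 grid

# The eigenvalues of K1 (x) K2 are all pairwise products of the
# factors' eigenvalues, so one 12x12 eigendecomposition is replaced
# by a 3x3 and a 4x4 one.
kron_eigs = np.sort(np.outer(np.linalg.eigvalsh(K1),
                             np.linalg.eigvalsh(K2)).ravel())
full_eigs = np.sort(np.linalg.eigvalsh(K))
print(np.allclose(kron_eigs, full_eigs))  # → True
```

This is why exploiting Kronecker structure scales so well: solves and log-determinants on the full grid covariance reduce to operations on the small per-dimension matrices.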

Variational Regression (new!)

Try this if:

  • You have too much data for exact inference, even with KissGP, deep kernel learning, etc.
  • Your model needs variational inference anyway (e.g. if you're doing some sort of clustering)

See the example for more info.

Multitask Regression

See the example for more info.

Classification

Do you have lots of data?

No: Start with the basic example

Is your training data one-dimensional?

Yes: Use KissGP classification

Does your output decompose additively?

Yes: Use Additive Grid Interpolation

Is your training data three-dimensional or less?

Yes: Exploit Kronecker structure

No: Try Deep Kernel classification (under construction)
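For classification, predict_proba's output is driven by the BernoulliLikelihood, which squashes the latent Gaussian through a probit link; averaging that link over a Gaussian latent has a well-known closed form, p(y = 1) = Φ(μ / √(1 + σ²)). A standard-library sketch of that last step (the function names are illustrative, not part of the wrapper's API):

```python
import math

def normal_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bernoulli_predictive(mean, variance):
    # Averaging the probit likelihood over the latent Gaussian
    # N(mean, variance) has the closed form
    #   p(y = 1) = Phi(mean / sqrt(1 + variance))
    return normal_cdf(mean / math.sqrt(1.0 + variance))

# A latent mean of zero gives a 50/50 prediction regardless of variance;
# a confidently positive latent gives a probability near one.
print(round(bernoulli_predictive(0.0, 2.0), 2))  # → 0.5
print(bernoulli_predictive(3.0, 0.1) > 0.99)     # → True
```

Note how latent variance pulls predictions toward 0.5: the same latent mean yields a less confident class probability when the model is more uncertain about the latent function.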