JaxGaussianProcesses/GPJax

docs: Fit models with standard for loop + notebook on training loops.

Closed this issue · 3 comments

We have a convenient fit function to train GPs against objectives. It would be good, though, to show in e.g. the regression notebook a simple Python for loop and a simple lax.scan training loop, to demonstrate that users can write their own training loops. This would give insight that, e.g., the ConjugateMLL is something you can take jax.grad of and simply run gradient descent on.
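The idea can be sketched with plain JAX, without relying on any particular GPJax API (the GP marginal likelihood below is written out by hand as a stand-in for ConjugateMLL; data, kernel choice, and learning rate are illustrative assumptions):

```python
import jax
import jax.numpy as jnp

# Toy regression data (illustrative, not from the GPJax docs).
key = jax.random.PRNGKey(0)
x = jnp.linspace(0.0, 1.0, 20)[:, None]
y = jnp.sin(6.0 * x[:, 0]) + 0.1 * jax.random.normal(key, (20,))

def nll(params):
    # Negative log marginal likelihood of a GP with an RBF kernel plus
    # observation noise. Parameters live in log-space to stay positive.
    ell = jnp.exp(params["log_lengthscale"])
    sf = jnp.exp(params["log_signal"])
    sn = jnp.exp(params["log_noise"])
    d2 = (x - x.T) ** 2
    K = sf**2 * jnp.exp(-0.5 * d2 / ell**2) + sn**2 * jnp.eye(x.shape[0])
    L = jnp.linalg.cholesky(K)
    alpha = jax.scipy.linalg.cho_solve((L, True), y)
    return (0.5 * y @ alpha
            + jnp.sum(jnp.log(jnp.diag(L)))
            + 0.5 * y.size * jnp.log(2.0 * jnp.pi))

params = {"log_lengthscale": jnp.array(0.0),
          "log_signal": jnp.array(0.0),
          "log_noise": jnp.array(0.0)}

# The objective is just a scalar function of a pytree of parameters,
# so jax.grad applies directly and a for loop does gradient descent.
loss_grad = jax.jit(jax.value_and_grad(nll))
lr = 0.05
for step in range(100):
    loss, grads = loss_grad(params)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
```

Swapping `nll` for GPJax's ConjugateMLL objective (applied to a posterior and dataset) would give the same loop against the library's own models.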

Then it would be good to link this to a more extensive notebook exposing users to stopping gradients, bijector transformations, etc., and showing how to add a progress bar to the training loop.
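Those two ideas, together with a lax.scan loop, can also be sketched in plain JAX (the quadratic objective and softplus transform here are illustrative assumptions, standing in for a GP objective and GPJax's bijectors):

```python
import jax
import jax.numpy as jnp

# Softplus plays the role of a bijector: it maps an unconstrained value
# to a positive one, the usual trick for lengthscales and variances.
softplus = lambda u: jnp.log1p(jnp.exp(u))

def loss(params):
    ell = softplus(params["raw_ell"])  # trained, constrained positive
    # stop_gradient freezes this parameter: no gradient flows through it.
    sn = jax.lax.stop_gradient(softplus(params["raw_sn"]))
    return (ell - 2.0) ** 2 + (sn - 0.5) ** 2

lr = 0.1

def train_step(params, _):
    grads = jax.grad(loss)(params)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params, loss(params)  # carry new params, log the loss

params0 = {"raw_ell": jnp.array(0.0), "raw_sn": jnp.array(0.0)}
# lax.scan runs the whole training loop inside one compiled computation
# and returns the per-step loss history for free.
params, history = jax.lax.scan(train_step, params0, xs=None, length=200)
```

Here `raw_sn` never moves (its gradient is zeroed by stop_gradient) while `raw_ell` is optimised through the softplus transform, which is the pattern the extended notebook would demonstrate with GPJax's actual parameters.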

There has been no recent activity on this issue. To keep our issues log clean, we remove old and inactive issues.
Please update to the latest version of GPJax and check if that resolves the issue. Let us know if that works for you by leaving a comment.
This issue is now marked as stale and will be closed if no further activity occurs. If you believe that this is incorrect, please comment. Thank you!


There has been no activity on this PR for some time. Therefore, we will be automatically closing the PR if no new activity occurs within the next seven days.
Thank you for your contributions.