Introduction to JuliaGPs + Turing #423

Merged: 3 commits merged into master from wct/gp-intro on Sep 26, 2023
Conversation

@willtebbutt (Member) commented on Sep 25, 2023:

I've written up the tutorial I made for the BSU workshop as a TuringTutorial. I think it's okay to go in as-is, but I suspect that people will have ideas on how it could be improved, so I'd like to iterate a bit before merging.

I would appreciate the following two kinds of feedback:

  1. comments on clarity,
  2. suggestions for additional analyses / modifications that would improve the tutorial.

@torfjelde (Member) left a comment:


LGTM!

I added two minor comments, but dope stuff :)

Comment on lines +130 to +135
Unfortunately, there is not a simple way to enforce monotonicity in the samples from a GP,
and we can see this in some of the plots above, so we must hope that we have enough data to
ensure that this relationship approximately holds under the posterior.
In any case, you can judge for yourself whether you think this is the most useful
visualisation that we can perform -- if you think there is something better to look at,
please let us know!
@torfjelde (Member):

Maybe this is crazy (and I'm def not suggesting we put this in the tutorial), but could we perform a quick linear regression and use this as a prior for the GP?
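A minimal sketch of the suggestion, assuming AbstractGPs' `CustomMean` and hypothetical data vectors `x` and `y` standing in for the tutorial's data:

```julia
using AbstractGPs

# Quick ordinary-least-squares fit of y ≈ β[1] + β[2] * x.
X = hcat(ones(length(x)), x)
β = X \ y

# Use the fitted line as the GP's prior mean, so the kernel only has to
# model deviations from the linear trend. This encourages (but does not
# guarantee) the monotone behaviour discussed above.
f = GP(AbstractGPs.CustomMean(t -> β[1] + β[2] * t), SEKernel())
```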

@willtebbutt (Member Author):

Hmmmm I'm not entirely sure that I understand. If I've understood correctly, the suggestion is to look at the data in order to inform our prior, which I'm not sure makes sense to me.

@torfjelde (Member):

So this would indeed make it, I dunno, "semi-Bayesian" or something, but it could be a useful trick to get monotonicity (though it wouldn't guarantee it). But I guess the proper way would be to just treat it as a multivariate normal and model the mean directly?

Another thing: is it sensible to put a prior on the GP prior?
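A sketch of the "model the mean directly" alternative: keep everything in the model by placing priors on the parameters of a linear mean function. The model name, priors, and noise variance here are illustrative, not from the tutorial:

```julia
using Turing, AbstractGPs

@model function gp_with_linear_mean(x, y)
    # Priors on the linear-trend parameters, instead of fitting them
    # to the data beforehand.
    intercept ~ Normal(0, 2)
    slope ~ Normal(0, 2)
    f = GP(AbstractGPs.CustomMean(t -> intercept + slope * t), SEKernel())
    # A FiniteGP is a multivariate normal, so it can serve as the likelihood.
    y ~ f(x, 0.1)  # 0.1 = assumed observation-noise variance
end

chain = sample(gp_with_linear_mean(x, y), NUTS(), 1_000)
```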

@willtebbutt (Member Author):

Well in some ways we are here -- we're placing a prior over the kernel parameters, which is implicitly placing a prior over the GP prior. Are you imagining something non-parametric though?
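Concretely, the kind of thing being described -- priors over kernel parameters inducing a distribution over GP priors -- looks roughly like this (an illustrative sketch; the parameter names and priors are not the tutorial's):

```julia
using Turing, AbstractGPs

@model function gp_with_kernel_priors(x, y)
    # Priors over the kernel's variance and lengthscale: each draw of (v, ℓ)
    # picks out a different GP prior, so this is a prior over GP priors.
    v ~ LogNormal(0, 1)
    ℓ ~ LogNormal(0, 1)
    f = GP(v * with_lengthscale(SEKernel(), ℓ))
    y ~ f(x, 0.1)  # 0.1 = assumed observation-noise variance
end
```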

tutorials/15-gaussian-processes/15_gaussian_processes.jmd (outdated; review thread resolved)
@torfjelde (Member) left a comment:

Awesome stuff @willtebbutt :)

@willtebbutt merged commit a12fb9c into master on Sep 26, 2023 (1 check passed).
The delete-merged-branch bot deleted the wct/gp-intro branch on September 26, 2023 at 08:31.