Save samples as we go #2015
Comments
The AbstractMCMC interface is already designed to support exactly this use case with the iterator and transducer support it provides (it's explained in the AbstractMCMC docs: https://turinglang.org/AbstractMCMC.jl/dev/api/#Iterator). I think Turing could just make it a bit more convenient by not deferring all transformations to the end of sampling (see the discussion in #2011).
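For reference, a minimal sketch of what the iterator interface from the linked docs looks like in use. This is only an illustration: model, sampler, and save_transition are placeholders, and the exact signature and element type of AbstractMCMC.steps should be checked against the linked documentation.

```julia
using AbstractMCMC
using Random

# `model` and `sampler` stand for any model/sampler pair that implements the
# AbstractMCMC interface; `save_transition` is a hypothetical user-defined writer.
rng = Random.default_rng()
iterator = AbstractMCMC.steps(rng, model, sampler)

# The iterator yields one transition at a time, so samples can be written out
# and convergence checked while sampling is still running.
for (i, transition) in enumerate(Iterators.take(iterator, 1_000))
    save_transition(transition)
    i % 100 == 0 && @info "saved $i transitions"
end
```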
Let's start out with just making it a callback, and then we can potentially add a kwarg for it at a later stage. This might also just be more suited for TuringCallbacks.jl to begin with. Then if people start using it, we can merge it into Turing.jl proper.
Though this is true (and I've given the same response to this before 😅), some users of PPLs really just want to call sample.
When you say "transformations", what do you mean exactly? We currently do …
That you will end up with something different from calling …
Gotcha 👍 Regarding the callback @JaimeRZP, I'd say just start out simple: implement a callback that has a buffer where it can hold some …
I think the priority of the callback should be to save samples one by one in a format that can easily be exported to other programming languages. The audience I have in mind probably has plotting pipelines that they would like to re-use. I understand that re-using the whole transitions-to-MCMCChains machinery is tempting, but I think it is suboptimal for what the target user of this function wants. If the user wants all the nice functionality of bundle_samples, then I feel we already offer that with the current code, and users will get it at the end of sampling. I am happy to be persuaded though.
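To make this concrete, here is a rough sketch of what such a callback could look like, combining the buffer idea above with plain CSV output that other languages can read. It assumes the callback signature (rng, model, sampler, transition, state, iteration; kwargs...) that recent AbstractMCMC versions use, and flatten_params is a hypothetical helper that turns a transition into a vector of parameter values; this is not existing Turing/TuringCallbacks code.

```julia
# Sketch only: check the AbstractMCMC/TuringCallbacks docs for the exact
# callback signature. `flatten_params` is a hypothetical helper that extracts
# a vector of parameter values from a transition.
struct SaveSamplesCallback
    buffer::Vector{Vector{Float64}}
    path::String
    flush_every::Int
end

SaveSamplesCallback(path::String; flush_every::Int=100) =
    SaveSamplesCallback(Vector{Vector{Float64}}(), path, flush_every)

function (cb::SaveSamplesCallback)(rng, model, sampler, transition, state, iteration; kwargs...)
    push!(cb.buffer, flatten_params(transition))
    if length(cb.buffer) >= cb.flush_every
        # Append one comma-separated row per sample: trivial to read back from
        # a Python/R plotting pipeline.
        open(cb.path, "a") do io
            foreach(θ -> println(io, join(θ, ",")), cb.buffer)
        end
        empty!(cb.buffer)
    end
    return nothing
end

# Hypothetical usage:
# chain = sample(model, NUTS(), 10_000; callback=SaveSamplesCallback("chain.csv"))
```

Writing strictly one sample at a time is then just a matter of setting flush_every=1.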
After some discussion with @torfjelde, I have removed the keyword from …
Moved to TuringLang/TuringCallbacks.jl#44
Dear Turing team,
I feel like Turing would benefit from adding the possibility of saving the samples on the go. For example, in astronomy, where likelihood evaluations can take on the order of seconds and chains converge in a couple of weeks, people really like saving the samples as they go and periodically checking convergence.
At the moment I think the user can only save the samples once Turing is done sampling. However, we could offer saving samples as we go by adding a simple callback function that activates if the user provides a "chain_name" as a keyword to sample. I have drafted how this could be done for HMC samplers in this branch.
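For illustration, the user-facing call in the draft might look roughly like this; chain_name is the proposed keyword, not an existing sample keyword argument:

```julia
# Hypothetical API from the proposal: `chain_name` would trigger a callback
# that writes samples to disk as sampling progresses.
chain = sample(model, NUTS(), 10_000; chain_name="my_run")
```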