Integration with Turing and NestedSamplers #29
""" Example
""" |
Getting the following error while using |
Are you on version 0.3?
Not yet. I used NestedSamplers version 0.1 with Julia version 1.2. Updating it all now.
Any news on this? Integration between TuringLang and NestedSamplers would be a dream.
The integration with nested sampling should become trivial once TuringLang/DynamicPPL.jl#309 is merged.
Great! Does the PR include an example of this integration by any chance? Thank you for your hard work.
The few times I've tried figuring out how to do this integration, I've given up because I didn't understand how Turing worked internally. It was never clear to me how to compute the prior transform or loglikelihood methods using the DynamicPPL API. For example, in the current NestedSamplers framework, I need to know the number of parameters and have a prior transform.

@yebai how will TuringLang/DynamicPPL.jl#309 make this trivial? It is not clear to me from the PR discussion.
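For reference, a minimal sketch of what the current NestedSamplers.jl interface expects, following the package README at the time: a log-likelihood over a parameter vector, a transform from the unit cube into parameter space, and the number of parameters given to the sampler up front. The toy density and priors here are made up for illustration.

```julia
using NestedSamplers

# Log-likelihood over the parameter vector θ (a toy Gaussian here).
loglike(θ) = -0.5 * sum(abs2, θ)

# Prior transform: maps u ∈ [0, 1]² to the parameter space;
# componentwise this is the quantile of a Uniform(-5, 5) prior.
prior_transform(u) = 10 .* u .- 5

model = NestedModel(loglike, prior_transform)
sampler = Nested(2, 100)  # 2 parameters, 100 active points

# `sample` is the AbstractMCMC/StatsBase entry point.
chain, state = sample(model, sampler; dlogz=0.2)
```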
That PR will essentially just make it easier to do something like `loglikelihood(model, namedtuple)` or `loglikelihood(model, vector)` etc. with very high performance, i.e. make it super easy to evaluate the density of (a large subset of, but not all, definable) models. This would mean that you could make NestedSamplers.jl work with Turing.jl models.
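A hedged sketch of what that could look like; the `demo` model is made up, and the calls follow the direction of the PR rather than a final API:

```julia
using Turing  # re-exports Distributions and the DynamicPPL evaluation API

@model function demo(x)
    μ ~ Normal(0, 1)
    x ~ Normal(μ, 1)
end

m = demo(1.5)

# Evaluate densities directly against a NamedTuple of parameter values:
loglikelihood(m, (μ = 0.3,))  # log p(x | μ)
logjoint(m, (μ = 0.3,))       # log p(x, μ)
```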
What kinds of models might not work? And is there a timeline on the PR?
Next to evaluating the likelihood, how would one efficiently implement the prior transform?
Well, all models would technically work, but depending on what the underlying storage is, you might need to do some manual labour.
You can check out the docs for the PR here: https://turinglang.github.io/DynamicPPL.jl/previews/PR309/#DynamicPPL.SimpleVarInfo and the docstring for `SimpleVarInfo`.
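To make the storage point concrete, a small sketch along the lines of the `SimpleVarInfo` docs in that PR preview; the model is a made-up example, and the exact calls may have shifted since:

```julia
using DynamicPPL, Distributions

@model function demo(x)
    μ ~ Normal(0, 1)
    x ~ Normal(μ, 1)
end

m = demo(1.5)

# NamedTuple-backed storage: fast, but the set of variables is fixed up front.
svi = SimpleVarInfo((μ = 0.3,))
_, svi = DynamicPPL.evaluate!!(m, svi, DefaultContext())
getlogp(svi)  # log joint density at μ = 0.3
```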
This is open-source, bby, there are no timelines! 🎸 Nah, I kid :) That PR is my highest priority in my open-source work, so it shouldn't be long now. I'm just ironing out some quirks and making sure that all test cases are covered. This is a pretty significant change, so we need to make sure that it's not a bad idea.
So this is actually something I'm somewhat confused about: why do we need to work on the unit cube? I'm new to nested sampling, so please forgive my ignorance 🙃 But in general, we deal with transformations between spaces in Turing.jl too; in fact, we have an entire package to deal with this (Bijectors.jl). So IIUC, this shouldn't be an issue, since we can just provide the transform using a composition of bijectors. But we could also provide a function which extracts the distributions used, if need be.
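For illustration, a sketch of the Bijectors.jl route mentioned above; the prior is arbitrary, `inverse` is spelled `inv` in some older versions, and note these transforms map between the constrained support and all of ℝ rather than the unit cube:

```julia
using Bijectors, Distributions

d = truncated(Normal(0, 1), 0, Inf)  # a constrained prior, support on (0, ∞)
b = bijector(d)                      # support of d -> ℝ
binv = inverse(b)                    # ℝ -> support of d

y = 0.3        # a point in the unconstrained space
x = binv(y)    # mapped back into the support of d
logpdf(d, x)   # density evaluated in the constrained space
```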
This is not strictly necessary from a technical point of view. To perform nested sampling, one only needs to sample from a constrained prior, or sample from the unconstrained prior and then perform a rejection-sampling step. However, it seems very popular in astrophysics to sample from a unit cube to simplify implementation. @mileslucas Would it be possible to work directly with the prior?
It seems the current `prior_transform` is coupled with the proposal implementations. That makes working directly with priors slightly more involved: we need to implement new proposals that operate in the parameter space instead of the unit-cube space. The current implementation also limits the applicable models of nested sampling to cases whose prior can be factorised into univariate distributions with known quantile functions. I think a near-term integration goal should be supporting the same family of models that nested sampling can handle, i.e. models with priors that can be written as a product of independent univariate distributions.
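For that family of models, the unit-cube transform is just the componentwise quantile (inverse CDF) of each prior; a minimal sketch with made-up priors:

```julia
using Distributions

# A prior that factorises into independent univariate distributions.
priors = [Normal(0, 1), Uniform(2, 10)]

# Componentwise quantiles map the unit cube onto the prior's support.
prior_transform(u) = quantile.(priors, u)

prior_transform([0.5, 0.5])  # -> [0.0, 6.0]
```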
Initially it seemed like it's just that it's convenient, e.g. https://dynesty.readthedocs.io/en/latest/overview.html#challenges. But it seems to go a bit deeper than this.
Any news on this?
I’m working on it, although I’ve been pretty busy these past few weeks. If I had to make a guess, maybe next month? But it depends on a lot.
That's very nice to hear. Thank you for your work. If I can help with testing stuff, let me know.
Bump
NestedSamplers is an independent sampler library at the moment and lacks an interface for models specified by the domain-specific language (DSL) in Turing. However, since NestedSamplers supports the AbstractMCMC interface, it should be a relatively lightweight task to add such an interface. A good starting point is the elliptical slice sampler (ESS) interface, which can be found at …
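To sketch where such an interface could end up, here is a hypothetical bridge for the simple case where the prior factorises into independent univariate distributions. The priors and log-likelihood are written out by hand rather than extracted from a Turing model, which is exactly the part the interface would automate:

```julia
using NestedSamplers, Distributions

# Hand-written stand-ins for what an interface would extract from a Turing model:
data = randn(50) .+ 1.0
priors = [Normal(0, 3), Uniform(0.1, 5)]             # priors on (μ, σ)
loglike(θ) = sum(logpdf.(Normal(θ[1], θ[2]), data))  # log p(data | μ, σ)
prior_transform(u) = quantile.(priors, u)            # unit cube -> (μ, σ)

model = NestedModel(loglike, prior_transform)
chain, state = sample(model, Nested(2, 500); dlogz=0.2)
```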