
Potential interesting ESS improvement based on mixture of Gaussians #12

Open
yebai opened this issue Dec 14, 2020 · 1 comment
yebai commented Dec 14, 2020

The paper below proposes an interesting extension of ESS that handles models with non-Gaussian priors, as well as models with Gaussian priors but informative likelihoods. The authors also design their algorithm with parallel implementations in mind. If this works well in practice, it could be an interesting gradient-free alternative to the HMC/NUTS sampler for low- to mid-dimensional problems.

Nishihara, R., Murray, I., & Adams, R. P. (2014). Parallel MCMC with Generalized Elliptical Slice Sampling. Journal of Machine Learning Research: JMLR, 15(61), 2087–2112.
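For context, the standard ESS update that the paper generalizes (Murray, Adams & MacKay, 2010) can be sketched in a few lines. This is a minimal NumPy sketch, not the parallel generalized version from the paper; the function and argument names are illustrative:

```python
import numpy as np

def elliptical_slice(f, log_lik, prior_sample, rng):
    """One elliptical slice sampling update for a N(0, Sigma) prior.

    f            -- current state (assumed to have prior N(0, Sigma))
    log_lik      -- callable returning the log-likelihood of a state
    prior_sample -- callable returning one draw from the N(0, Sigma) prior
    rng          -- a numpy Generator
    """
    nu = prior_sample()                          # auxiliary draw defining the ellipse
    log_y = log_lik(f) + np.log(rng.uniform())   # slice threshold
    theta = rng.uniform(0.0, 2.0 * np.pi)        # initial proposal angle
    theta_min, theta_max = theta - 2.0 * np.pi, theta
    while True:
        f_new = f * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_new) > log_y:
            return f_new                         # accepted
        # shrink the angle bracket towards theta = 0 and retry;
        # theta = 0 recovers the current state, so the loop terminates
        if theta < 0.0:
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)

# usage: standard-normal prior, Gaussian likelihood centred at 1
rng = np.random.default_rng(0)
f = np.zeros(2)
log_lik = lambda x: -0.5 * np.sum((x - 1.0) ** 2)
prior_sample = lambda: rng.standard_normal(2)
for _ in range(100):
    f = elliptical_slice(f, log_lik, prior_sample, rng)
```

The generalized version in the paper replaces the fixed Gaussian prior with a scale-mixture-of-Gaussians (multivariate-t) approximation fitted from parallel chains, which is what lets it handle non-Gaussian priors.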

@imurray @robertnishihara

@devmotion
Member

I read the paper quite some time ago but never tried it.
