Frequently Asked Questions
Why does fitting a model use so much memory (or crash my computer)?

While there are a number of possible causes, the most common is a lack of RAM. The most memory-intensive step is usually storing the Hessian used for the Laplace approximation. A crude estimate of the memory requirement, in gigabytes, is 11*(N*D)^2/(10^9), where N is the number of samples and D is the number of multinomial categories. If you don't have enough memory, fido has an alternative uncertainty quantification algorithm called the Multinomial-Dirichlet Bootstrap. Essentially, it takes multinomial-Dirichlet draws centered on the MAP estimate. Suppose the following line is your call to fit a pibble model:
fit <- pibble(Y, X)
You can switch to the Multinomial-Dirichlet Bootstrap by changing this to:
fit <- pibble(Y, X, multDirichletBoot=0.5, calcGradHess=FALSE)
The parameter multDirichletBoot can be any scalar greater than 0 and represents the prior parameter of the Dirichlet (think of it as a pseudo-count, so values greater than 1 are probably not what you want). Note: this option is currently only available on the development branch.
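For intuition only, here is a minimal sketch of the idea behind the Multinomial-Dirichlet Bootstrap, not fido's internal implementation: for each sample's observed count vector, plausible underlying compositions are drawn from a Dirichlet centered on those counts, with the multDirichletBoot value acting as a pseudo-count added to every category. The helper name rdirichlet_draw is made up for this example.

```r
# Illustrative sketch (not fido's internal code): draw one plausible
# composition for a count vector y from Dirichlet(y + pseudo_count).
rdirichlet_draw <- function(y, pseudo_count = 0.5) {
  # Normalized independent gamma draws give a Dirichlet sample.
  g <- rgamma(length(y), shape = y + pseudo_count, rate = 1)
  g / sum(g)
}

y <- c(10, 0, 3, 57)                          # observed counts for one sample
draws <- t(replicate(2000, rdirichlet_draw(y)))
colMeans(draws)  # close to (y + 0.5) / sum(y + 0.5)
```

The spread of these draws across replicates is what provides uncertainty quantification without ever forming the Hessian.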
Another option is to just use the MAP estimate without uncertainty quantification. This is probably fine if you are using fido purely for hypothesis generation:
fit <- pibble(Y, X, n_samples=0, calcGradHess=FALSE)
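Whichever route you take, you can check in advance whether the Laplace approximation will fit in memory by evaluating the crude formula given above. A small sketch (the function name hessian_mem_gb is made up for this example):

```r
# Crude estimate (in GB) of the memory needed to store the Hessian
# for the Laplace approximation: 11 * (N * D)^2 / 10^9, where
# N = number of samples and D = number of multinomial categories.
hessian_mem_gb <- function(N, D) 11 * (N * D)^2 / 1e9

hessian_mem_gb(N = 100, D = 75)   # ~0.62 GB: fine on most machines
hessian_mem_gb(N = 500, D = 500)  # ~687.5 GB: far too large
```

If the estimate exceeds your available RAM, reach for the Multinomial-Dirichlet Bootstrap or the MAP-only fit shown above.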
More rarely, you could have memory problems because you are trying to produce too many posterior samples (n_samples is too large for the memory on your computer). There are a number of ways around this, but I have yet to hear of anyone actually running into this problem (so I will write up those solutions later).