
Full pooling and a single study #178

Open
wwiecek opened this issue Apr 17, 2023 · 0 comments
Labels: good first issue (Good for newcomers)
wwiecek commented Apr 17, 2023

Right now you can't do this:

# Fit a full-pooling model to the first 7 schools...
fe7n <- baggr(schools[1:7,], pooling = "full")
# ...then fit the 8th school alone, using the pooled estimate
# from fe7n (mean and SD of tau) as the hypermean prior
fe8u <- baggr(schools[8, ], pooling = "full",
              prior_hypermean = normal(treatment_effect(fe7n, s=T)$tau[["mean"]],
                                       treatment_effect(fe7n, s=T)$tau[["sd"]]))

(1) For fe8u you get:

Automatically chose Rubin model with aggregate data based on input data.
Error in prepare_prior(prior, data, stan_data, model, pooling, covariates,  : 
  You must specify hyper-SD prior manually when data has only one row.

But under full pooling there is no problem: prior_hypersd is not needed at all, so this error should not be raised. Let's fix that error message and write unit tests covering both the old and new cases.
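A minimal sketch of the intended guard. This is not the actual baggr internals (prepare_prior()'s real signature and logic may differ); the function and argument names below are assumptions, illustrating only that the one-row check should fire solely when the model actually has a hyper-SD parameter, i.e. under partial pooling:

```r
# Hypothetical sketch, not actual baggr code: require a manual
# hyper-SD prior for one-row data only under partial pooling.
# Under pooling = "full" the hyper-SD is not a model parameter,
# so a single row is fine.
check_hypersd_prior <- function(data, pooling, prior_hypersd = NULL) {
  if (pooling == "partial" && nrow(data) == 1 && is.null(prior_hypersd))
    stop("You must specify hyper-SD prior manually when data has only one row.")
  invisible(TRUE)
}
```

A unit test could then assert that the error is raised for partial pooling with one row, and not raised for full pooling with one row.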

(2) For fe7n you get:

Setting hyper-SD prior using 10 times the naive SD across sites
* hypersd [sigma_tau] ~ uniform(0, 112)

But under full pooling that prior is meaningless! It's arguably OK that it gets set and stored, but since it's never used, the user should not see this message. (Best of all would be to not set it at all.)
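Sketched as code, the fix for (2) might look like the following. Again this is a hypothetical sketch, not baggr's actual prior-setting code; the variable names and the default-prior construction here are assumptions:

```r
# Hypothetical sketch: only set (and announce) the default
# hyper-SD prior when the model actually uses one, i.e. not
# under full pooling, where sigma_tau is not a parameter.
if (pooling != "full" && is.null(prior_hypersd)) {
  message("Setting hyper-SD prior using 10 times the naive SD across sites")
  prior_hypersd <- uniform(0, 10 * naive_sd)
}
```

With this guard, fitting fe7n above would print no hyper-SD message at all.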

@wwiecek wwiecek added the good first issue Good for newcomers label Apr 17, 2023
@wwiecek wwiecek added this to the v0.7 milestone Dec 18, 2023