Sampling from prior without running a separate model

Submitted by 独自空忆成欢 on 2019-12-05 21:25:35

There are two ways you can do this.

First, if the program is general enough, just pass in zero-size data so that the posterior is the prior. For instance, N = 0 in the regression example you gave will work (along with the right zero-sized x and y).
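For the regression model further down, a zero-size dataset looks like this in R (a sketch; it assumes the model string is stored in `ms` as in the code below, and that rstan is loaded):

```r
# With N = 0 the likelihood contributes nothing,
# so the "posterior" rstan returns is exactly the prior.
data_prior <- list(N = 0L,
                   x = numeric(0),
                   y = numeric(0))

fit_prior <- stan(model_code = ms,
                  data = data_prior,
                  chains = 1, warmup = 1e3, iter = 2e3)
```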

Second, you can write a pure Monte Carlo generator (doesn't use MCMC) in the generated quantities block. Something like:

generated quantities {
  // fabs() folds the Cauchy onto [0, inf); without it, negative draws
  // would violate the <lower = 0> constraint and be rejected
  real<lower = 0> sigma_sim = fabs(cauchy_rng(0, 2));  // wide tail warning!
  real beta_sim = normal_rng(0, 10);
  real alpha_sim = normal_rng(0, 20);
}

The second approach is much more efficient: each iteration is an independent draw from the priors, so there's no MCMC to run at all.
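A sketch of how such a model can be run from rstan (`ms_prior` is an assumed name for a model string containing only the generated quantities block above; since the model has no parameters, the fixed-parameter sampler is needed):

```r
# No parameters block, so use the fixed-parameter "sampler":
# every iteration just executes generated quantities once,
# giving independent Monte Carlo draws from the priors.
prior_fit <- stan(model_code = ms_prior,
                  algorithm = "Fixed_param",
                  chains = 1, iter = 1e3)
```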

The answer to how to do this occurred to me on the bus this morning. Of course by the time I finished writing it out, @Bob Carpenter posted the solution I was looking for. By comparison my way is quite cumbersome and hacky, but it does work.

All we need to do is specify priors that mirror the actual priors but which are never passed downstream into a likelihood function.

So in the example above all we need to do is create these mirror variables within the model string. We'll call them p_alpha, p_beta, and p_sigma. These will be analogs of alpha, beta, and sigma but will not appear in any likelihood function.

Note that we have to declare these variables in the parameters {} block and give them their distributions in the model {} block.

ms <- "
  data {
    int<lower=0> N;
    vector[N] x;
    vector[N] y;
  }

  parameters {
    // priors to sample from
    real p_alpha;
    real p_beta;
    real p_sigma;

    // real priors
    real alpha;
    real beta;
    real<lower=0> sigma;
  }

  model {
    vector[N] mu;

    // priors to sample from
    p_sigma ~ cauchy(0, 2);
    p_beta ~ normal(3,1);  // for didactic purposes
    p_alpha ~ normal(0,100);

    // actual priors
    sigma ~ cauchy(0, 2);
    beta ~ normal(0,10);
    alpha ~ normal(0,100);

    // likelihood
    for ( i in 1:N ) {
      mu[i] = alpha + beta * x[i];
    }
    y ~ normal(mu, sigma);
  }
"

Note that the distributions of the mirror parameters should match those of the actual priors, as I have done for p_alpha/alpha and (almost) for p_sigma/sigma — p_sigma is declared without the <lower=0> constraint, so it draws from the full Cauchy rather than the half-Cauchy that the constraint implies for sigma, which is why negative p_sigma values appear in the output below. For didactic purposes I have deliberately made the centre and spread of p_beta different from beta, since I will graph the two on the same plot below.

Now run the model again

fit1 <- stan(model_code = ms,     
             data = data_reg,       
             chains = 1,            
             warmup = 1e3,         
             iter = 2e3)  

And extract the samples

post <- as.data.frame(extract(fit1, pars = c("p_alpha", "p_beta", "p_sigma", "alpha", "beta", "sigma")))  

head(post)


# output
    p_alpha   p_beta     p_sigma    alpha     beta    sigma
1 -81.44259 3.275672  -1.1416369 3.121382 2.499459 2.354001
2 161.03740 3.694711   0.2989131 3.648288 2.335520 2.140973
3 126.58106 3.495947  -2.0027929 3.846835 2.266247 3.037055
4  18.55785 3.283425  -0.4045153 2.903958 1.854639 1.807591
5 103.02826 5.213568 -18.3721863 3.980290 1.725396 2.178264
6  49.50477 1.737679   6.5971377 4.209471 2.535044 2.941958

Here are the priors and posteriors as separate plots

So now we have raw priors and posteriors for the same parameters in the same dataframe.

Now what if we want to put prior and posterior on the same graph?

First put the two parameters p_beta and beta into a dataframe, making it long-form so that estimates are in one column and distribution (prior vs posterior) in the other.

library(dplyr)
library(tidyr)  # for gather()

betaDF <- post %>%
  dplyr::select(grep("^.*beta$", names(.))) %>%
  gather(key = source, value = estimate) %>%
  transform(source = factor(ifelse(source == "p_beta", "prior", "posterior"),
                            levels = c("prior", "posterior")))

Now graph it

library(ggplot2)

ggplot(betaDF, aes(x = estimate, fill = source)) +
  geom_density(alpha = 0.3) +
  coord_cartesian(xlim = c(-5, 10)) +
  labs(x = "beta")
