Probit regression with data augmentation in Stan


You could do

    data {
      int<lower=0> N;                    // number of observations
      int<lower=0> K;                    // number of predictors
      vector<lower=-1,upper=1>[N] sign;  // y = 0 -> -1, y = 1 -> 1
      matrix[N, K] x;                    // predictor variables
    }
    parameters {
      vector[K] beta;                // regression coefficients
      vector<lower=0>[N] abs_ystar;  // magnitude of the latent variable
    }
    model {
      beta ~ normal(0, 100);
      // ignore the warning about a Jacobian from the parser
      sign .* abs_ystar ~ normal(x * beta, 1);
    }
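To fit this model you also need to recode the binary outcomes into the sign vector it expects. Here is a minimal sketch using CmdStanPy; the file name probit_da.stan and the simulated data are assumptions for illustration, not part of the original answer:

    # sketch: fit the data-augmented probit model above with CmdStanPy
    import numpy as np
    from cmdstanpy import CmdStanModel

    rng = np.random.default_rng(1)
    N, K = 200, 3
    x = rng.normal(size=(N, K))
    beta_true = np.array([0.5, -1.0, 0.25])        # assumed true coefficients
    y = (x @ beta_true + rng.normal(size=N) > 0).astype(int)

    data = {
        "N": N,
        "K": K,
        "sign": 2 * y - 1,  # recode y = 0 -> -1, y = 1 -> 1 as the model expects
        "x": x,
    }

    # probit_da.stan is a hypothetical file holding the model above
    model = CmdStanModel(stan_file="probit_da.stan")
    fit = model.sample(data=data, seed=1)
    print(fit.stan_variable("beta").mean(axis=0))   # posterior means of beta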

That said, there is no reason to do data augmentation in Stan for a binary probit model, unless some of the outcomes are missing or something like that. It is more straightforward (and reduces the parameter space to K instead of K + N) to do

    data {
      int<lower=0> N;                   // number of observations
      int<lower=0> K;                   // number of predictors
      array[N] int<lower=0,upper=1> y;  // binary outcomes
      matrix[N, K] x;                   // predictor variables
    }
    parameters {
      vector[K] beta;  // regression coefficients
    }
    model {
      vector[N] mu = x * beta;  // linear predictor
      beta ~ normal(0, 100);
      for (n in 1:N)
        mu[n] = Phi(mu[n]);     // probit link: success probability
      y ~ bernoulli(mu);
    }

If you really care about the latent utility, you can generate it via rejection sampling in the generated quantities block, like this:

    generated quantities {
      vector[N] ystar;
      {
        vector[N] mu = x * beta;
        for (n in 1:N) {
          real draw = not_a_number();
          if (y[n] == 1)  // latent utility is positive when y = 1
            while (!(draw > 0))
              draw = normal_rng(mu[n], 1);
          else            // and negative when y = 0
            while (!(draw < 0))
              draw = normal_rng(mu[n], 1);
          ystar[n] = draw;
        }
      }
    }
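Because the truncated normal has a tractable inverse CDF, you can also draw the same latent utilities after fitting, outside Stan, without any rejection loop. Here is a hedged Python sketch using scipy.stats.truncnorm (an inverse-CDF sampler rather than the rejection scheme above, and not from the original answer); beta_draws, x, and y are assumed to be the posterior draws and data from the fit:

    # sketch: post-hoc truncated-normal draws of the latent utilities
    import numpy as np
    from scipy.stats import truncnorm

    def latent_utilities(beta_draws, x, y, rng=None):
        """One draw of ystar per posterior draw of beta.

        beta_draws : (S, K) array of posterior draws
        x          : (N, K) design matrix
        y          : (N,) array of 0/1 outcomes
        returns    : (S, N) array of latent utilities
        """
        rng = np.random.default_rng(rng)
        mu = beta_draws @ x.T  # (S, N) linear predictors
        # standardized truncation bounds for truncnorm:
        # (0, inf) on the raw scale when y = 1, (-inf, 0) when y = 0
        a = np.where(y == 1, -mu, -np.inf)
        b = np.where(y == 1, np.inf, -mu)
        return truncnorm.rvs(a, b, loc=mu, scale=1.0, random_state=rng)

This is distributionally equivalent to the rejection sampler in generated quantities, but its cost does not depend on how far the truncation region is out in the tail.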
