Question
I'm trying to model the observation "time" as a random variable with OpenBUGS via R (R2OpenBUGS). When all the observation times are available (no NAs) everything works, but if I set one of the times to NA, nothing happens. I tested the same code with WinBUGS and got the trap error 'NIL dereference (read)'. So my question is: is there something really wrong in my code, or is my model just too weird for BUGS?
My model is like this:
model{
  for(i in 1:k){
    obs[i] ~ dbern(p)                 # is the observation made at time 1 or 2?
    y[(i-1)*2 + obs[i] + 1] <- x[i]   # x[i] goes into the slot for its observation time
  }
  for(i in 1:n){
    y[i] ~ dnorm(mu, tau)
  }
  mu ~ dnorm(0, 0.0001)
  tau ~ dgamma(0.001, 0.001)
  p ~ dunif(0, 1)
}
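The idea of the index expression is that observation i goes into slot 2*i-1 or 2*i of y, depending on obs[i]. As a quick sanity check in plain R (just an illustration outside BUGS, with made-up obs values):

k <- 5
obs <- c(0, 1, 1, 0, 1)                  # example observation times (0 or 1)
idx <- (seq_len(k) - 1) * 2 + obs + 1    # the slot of y that x[i] is written to
idx                                      # 1 4 6 7 10, i.e. slot 2*i-1 or 2*i for each i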
And the R code looks like this:
library(R2OpenBUGS)

k <- 5         # number of observations
n <- 2*k       # y has two candidate slots per observation
x <- obs <- rep(NA, k)
for(i in 1:k)
{
  obs[i] <- sample(c(0,1), 1)  # observation time of the ith observation
  x[i] <- rnorm(1)             # observed value
}
obs[2] <- NA   # one of the sampling times is missing

INITS <- list(list(tau=1, mu=0, p=0.5))
DATA <- list(x=x, n=n, k=k, obs=obs)

ob <- bugs(
  data=DATA,
  inits=INITS,
  parameters.to.save=c("tau","mu","p","y"),
  model.file="BUGSModel.R",
  n.chains=1,
  n.iter=50,
  n.burnin=10,
  n.thin=1,
  DIC=FALSE)
Answer 1:
If I understand your question correctly, you are asking whether the expression

obs[i] ~ dbern(p)

is too weird for Win/OpenBUGS to handle the missing value. No, I don't think so; BUGS can handle missing values declared this way and will even impute them from their posterior distribution.
But I have a strong suspicion that

y[(i-1)*2 + obs[i]+1] <- x[i]

really is the problem! It can trip BUGS up because you force it to compute an array index from the observation obs[i], which is missing. You should try to find another way to express this. First simplify the model to drop that line, and I'd bet the problem disappears.
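For example, here is a sketch of the kind of simplification I mean (only a sketch: it assumes the slots of y that x never fills are extra latent draws you don't actually need, so x and obs can be modelled directly; the file name is just an example). With this formulation a missing obs[i] is only a missing Bernoulli outcome, which BUGS should impute without complaint:

simpler.model <- "
model{
  for(i in 1:k){
    obs[i] ~ dbern(p)       # a missing obs[i] is simply imputed
    x[i] ~ dnorm(mu, tau)   # observed values enter the likelihood directly
  }
  mu ~ dnorm(0, 0.0001)
  tau ~ dgamma(0.001, 0.001)
  p ~ dunif(0, 1)
}"
writeLines(simpler.model, "SimplerModel.txt")  # then pass model.file='SimplerModel.txt' to bugs()

Note that DATA would then only need x, obs and k.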
Source: https://stackoverflow.com/questions/9465825/openbugs-missing-value-in-bernoulli-distribution