bayesian

OpenBUGS - Variable is not defined

穿精又带淫゛_ submitted on 2019-12-02 12:21:38
Question (migrated from Cross Validated): I'm using the following code in OpenBUGS to perform an analysis:

model {
    for (i in 1:467) {
        probit(p[i]) <- gamma0 + gamma1*drug[i] + gamma2*CD41[i]
        R[i] ~ dbern(p[i])
        junk[i] <- ID[i]
    }
    gamma0 ~ dnorm(0, .0001)
    gamma1 ~ dnorm(0, .0001)
    gamma2 ~ dnorm(0, .0001)
}

ID[] drug[] CD41[] R[]
1 0 114 NA
2 1 40 NA
3 1 12 0
4 0 15 0
....
END

And I'm receiving the following error: Variable CD41[] is not defined
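In OpenBUGS this error usually means the data column was never loaded, not that the model is wrong. As a sanity check outside BUGS, the probit link the model uses can be sketched in plain Python; the coefficients and covariate values below are hypothetical, not the asker's data:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical coefficients and covariates (illustration only).
gamma0, gamma1, gamma2 = -0.5, 0.8, 0.01
drug = np.array([0, 1, 1, 0])
cd4 = np.array([114.0, 40.0, 12.0, 15.0])

# Probit link: p_i = Phi(gamma0 + gamma1*drug_i + gamma2*CD41_i)
eta = gamma0 + gamma1 * drug + gamma2 * cd4
p = norm.cdf(eta)

print(np.round(p, 3))  # success probabilities, all in (0, 1)
```

The probit link guarantees each p[i] lies in (0, 1), which is what dbern requires.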

“Multiple definition of node a” error in Winbugs

点点圈 submitted on 2019-12-02 11:48:31
Okay, I just rewrote my code. Now the problem is that when I compile it, I get a "multiple definition of node a" error. Does anyone know what is wrong in my code? I created the variables a, b and c so that the model doesn't have so many constants.

model {
    for (i in 1:n) {
        a <- (k[1] + step(s1[i]-.9)*k[2] + step(s1[i]*.5-.9)*k[3])
        b <- (r[1] + step(s2[i]-.9)*r[2] + step(s2[i]*.5-.9)*r[3])
        c <- (s[1] + step(s3[i]-.9)*s[2] + step(s3[i]*.5-.9)*s[3])
        dummy[i] <- 0
        dummy[i] ~ dloglik(logLike[i])
        # This is the log transformation of the 3-variate Poisson
        logLike[i] <- -theta12[i] + a*log(theta12[i]) - logfact(a) - theta13

Fit a bayesian linear regression and predict unobservable values

你。 submitted on 2019-12-02 01:47:07
I'd like to use JAGS plus R to fit a linear model with observable quantities, and make inference about unobservable ones. I found lots of examples on the internet about how to fit the model, but nothing on how to extrapolate its coefficients after having fitted the model in the JAGS environment. So I'd appreciate any help on this. My data look like the following:

ngroups <- 2
group <- 1:ngroups
nobs <- 100
dta <- data.frame(group = rep(group, each = nobs),
                  y = rnorm(nobs * ngroups),
                  x = runif(nobs * ngroups))
head(dta)

JAGS has powerful ways to make inference about missing data, and once you get the
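The usual recipe is to push every posterior draw of the coefficients through the regression line at the new x, then add observation noise, giving the full posterior predictive. A Python sketch of that step, using made-up normal draws in place of the samples JAGS would actually return:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior draws, standing in for samples from JAGS.
n_draws = 4000
beta0 = rng.normal(1.0, 0.1, n_draws)        # intercept draws
beta1 = rng.normal(2.0, 0.2, n_draws)        # slope draws
sigma = np.abs(rng.normal(0.5, 0.05, n_draws))  # residual sd draws

# Posterior predictive at an unobserved x: propagate every draw
# through the line, then add observation noise.
x_new = 0.7
mu = beta0 + beta1 * x_new
y_new = rng.normal(mu, sigma)

print(y_new.mean(), np.percentile(y_new, [2.5, 97.5]))
```

Equivalently, the unobserved y can be declared as NA in the JAGS data and monitored; JAGS then samples it as just another unknown node.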

Modified BPMF in PyMC3 using `LKJCorr` priors: PositiveDefiniteError using `NUTS`

给你一囗甜甜゛ submitted on 2019-12-01 03:06:57
Question: I previously implemented the original Bayesian Probabilistic Matrix Factorization (BPMF) model in pymc3. See my previous question for reference, data source, and problem setup. Per the answer to that question from @twiecki, I've implemented a variation of the model using LKJCorr priors for the correlation matrices and uniform priors for the standard deviations. In the original model, the covariance matrices are drawn from Wishart distributions, but due to current limitations of pymc3, the
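One common source of PositiveDefiniteError in this setup is rebuilding the full correlation matrix from LKJCorr's flattened upper-triangle output incorrectly. A small NumPy sketch of the reconstruction and the positive-definiteness check, with a made-up flat vector in place of an actual pymc3 draw:

```python
import numpy as np

# Hypothetical flattened upper-triangle draw for k = 3:
# entries correspond to pairs (0,1), (0,2), (1,2).
k = 3
flat = np.array([0.3, 0.1, 0.2])

R = np.eye(k)
R[np.triu_indices(k, 1)] = flat
R = R + R.T - np.eye(k)   # mirror into the lower triangle

# A valid correlation matrix must be symmetric positive definite;
# cholesky raises LinAlgError if it is not.
np.linalg.cholesky(R)
print(R)
```

Whether this exactly matches pymc3's internal ordering is an assumption worth checking against the LKJCorr docstring, but the symmetry-and-Cholesky check applies regardless.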

PyMC - variance-covariance matrix estimation

匆匆过客 submitted on 2019-12-01 01:59:35
I read the following paper ( http://www3.stat.sinica.edu.tw/statistica/oldpdf/A10n416.pdf ) where they model the variance-covariance matrix Σ as:

Σ = diag(S) * R * diag(S)   (Equation 1 in the paper)

S is the k×1 vector of standard deviations, diag(S) is the diagonal matrix with diagonal elements S, and R is the k×k correlation matrix. How can I implement this using PyMC? Here is some initial code I wrote:

import numpy as np
import pandas as pd
import pymc as pm

k = 3
prior_mu = np.ones(k)
prior_var = np.eye(k)
prior_corr = np.eye(k)
prior_cov = prior_var * prior_corr * prior_var
post_mu = pm.Normal("returns"
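One thing to watch in the initial code: `*` on NumPy arrays is elementwise, so `prior_var * prior_corr * prior_var` is not the matrix product Equation 1 calls for; that needs `@` (or np.dot). A standalone NumPy check of the decomposition, with made-up S and R (the PyMC priors are left out):

```python
import numpy as np

k = 3
S = np.array([0.5, 1.0, 2.0])          # standard deviations
R = np.array([[1.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.0]])        # correlation matrix

# Equation 1: Sigma = diag(S) R diag(S), using matrix products.
Sigma = np.diag(S) @ R @ np.diag(S)

# Elementwise this is Sigma[i, j] = S[i] * S[j] * R[i, j]:
assert np.allclose(Sigma, np.outer(S, S) * R)
print(np.diag(Sigma))   # the variances, S**2
```

In PyMC the same construction would be applied to the sampled S and R nodes via a deterministic, but that wiring is left out here.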

NaiveBayes in R Cannot Predict - factor(0) Levels:

允我心安 submitted on 2019-11-30 09:13:23
I have a dataset that looks like this:

data.flu <- data.frame(chills    = c(1,1,1,0,0,0,0,1),
                       runnyNose = c(0,1,0,1,0,1,1,1),
                       headache  = c("M","N","S","M","N","S","S","M"),
                       fever     = c(1,0,1,1,0,1,0,1),
                       flu       = c(0,1,1,1,0,1,0,1))

> data.flu
  chills runnyNose headache fever flu
1      1         0        M     1   0
2      1         1        N     0   1
3      1         0        S     1   1
4      0         1        M     1   1
5      0         0        N     0   0
6      0         1        S     1   1
7      0         1        S     0   0
8      1         1        M     1   1

> str(data.flu)
'data.frame': 8 obs. of 5 variables:
 $ chills   : num 1 1 1 0 0 0 0 1
 $ runnyNose: num 0 1 0 1 0 1 1 1
 $ headache : Factor w/ 3 levels "M","N","S": 1 2 3 1 2 3 3 1
 $ fever    : num 1 0 1 1 0 1 0 1
 $ flu      : num 0 1 1 1 0
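The factor(0) result from e1071's naiveBayes typically appears when the predictors (or newdata columns) are numeric rather than factors, so the fitted class-conditional tables never match the data being scored. To make clear what the classifier computes either way, here is a from-scratch Python sketch on the same toy flu data, with headache recoded M=0, N=1, S=2 (a stand-in for the R code, not the e1071 implementation):

```python
import numpy as np

# Columns: chills, runnyNose, headache (M=0, N=1, S=2), fever.
X = np.array([[1,0,0,1],[1,1,1,0],[1,0,2,1],[0,1,0,1],
              [0,0,1,0],[0,1,2,1],[0,1,2,0],[1,1,0,1]])
y = np.array([0,1,1,1,0,1,0,1])   # flu

def predict(x, X, y, alpha=1.0):
    """Categorical naive Bayes with Laplace smoothing."""
    scores = []
    for c in (0, 1):
        Xc = X[y == c]
        logp = np.log(len(Xc) / len(X))              # log prior
        for j, v in enumerate(x):                    # independence assumption
            n_vals = len(np.unique(X[:, j]))
            logp += np.log(((Xc[:, j] == v).sum() + alpha)
                           / (len(Xc) + alpha * n_vals))
        scores.append(logp)
    return int(np.argmax(scores))

print([predict(x, X, y) for x in X])
```

In R, the direct fix is usually to convert the 0/1 columns with as.factor() before fitting and predicting.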

What is the difference between a Bayesian network and a naive Bayes classifier?

£可爱£侵袭症+ submitted on 2019-11-30 00:16:29
What is the difference between a Bayesian network and a Naive Bayes classifier? I noticed one is just implemented in Matlab as classify, while the other has an entire net toolbox. If you could also explain in your answer which one is more likely to provide better accuracy, I would be grateful (not a prerequisite).

Richante: Short answer, if you're only interested in solving a prediction task: use Naive Bayes. A Bayesian network (which has a good Wikipedia page) models relationships between features in a very general way. If you know what these relationships are, or have enough data to derive them, then
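The structural relationship can be shown in a few lines: a Bayesian network factorizes the joint distribution as a product of P(node | parents) over a DAG, and naive Bayes is the special case where every feature's only parent is the class. A tiny sketch with made-up conditional probability tables (illustration only, unrelated to Matlab's toolboxes):

```python
# Network: Class -> F1, Class -> F2 (this IS the naive Bayes structure).
p_class = {0: 0.6, 1: 0.4}
# P(F1=f | C=c), keyed by (c, f); likewise for F2.
p_f1 = {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.7}
p_f2 = {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.1, (1, 1): 0.9}

def joint(c, f1, f2):
    # Chain rule along the DAG: P(C) * P(F1|C) * P(F2|C).
    return p_class[c] * p_f1[(c, f1)] * p_f2[(c, f2)]

total = sum(joint(c, f1, f2)
            for c in (0, 1) for f1 in (0, 1) for f2 in (0, 1))
print(total)   # a proper distribution sums to 1
```

A general Bayesian network would allow edges between F1 and F2 as well; naive Bayes trades that expressiveness for far fewer parameters to estimate.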


How to provide most relevant results with Multiple Factor Weighted Sorting

匆匆过客 submitted on 2019-11-28 15:23:39
I need to provide a weighted sort on 2+ factors, ordered by "relevancy". However, the factors aren't completely isolated, in that I want one or more of the factors to affect the "urgency" (weight) of the others. Example: contributed content ( articles ) can be up-/down-voted, and thus have a rating; they have a post date, and they're also tagged with categories. Users write the articles and can vote, and may or may not have some kind of ranking themselves (expert, etc). Probably similar to StackOverflow, right? I want to provide each user with a list of articles grouped by tag but sorted by
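One possible shape for such a score, sketched in Python: a weighted sum of normalized factors where one factor (author rank) scales another (votes) instead of entering independently, which is the kind of interaction the question describes. All weights and formulas here are invented for illustration:

```python
import math

def relevancy(votes, age_days, author_rank, w_votes=0.6, w_fresh=0.4):
    # Author rank boosts the vote weight rather than adding separately.
    vote_score = votes * (1.0 + 0.1 * author_rank)
    # Exponential freshness decay with a ~30-day time scale.
    freshness = math.exp(-age_days / 30.0)
    return w_votes * vote_score + w_fresh * freshness

# (name, votes, age in days, author rank)
articles = [("old-popular", 50, 120, 0), ("new-modest", 5, 1, 2)]
ranked = sorted(articles, key=lambda a: relevancy(a[1], a[2], a[3]),
                reverse=True)
print([name for name, *_ in ranked])
```

In practice the vote term would also be normalized (e.g. a Bayesian average) so that a handful of votes on a new article isn't swamped by raw counts.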

Incremental model update with PyMC3

你说的曾经没有我的故事 submitted on 2019-11-27 21:39:36
Is it possible to incrementally update a model in PyMC3? I can currently find no information on this; all the documentation works with a priori known data. But in my understanding, a Bayesian model also means being able to update a belief. Is this possible in PyMC3? Where can I find info on this? Thank you :)

Following @ChrisFonnesbeck's advice, I wrote a small tutorial notebook about incremental prior updating. It can be found here: https://github.com/pymc-devs/pymc3/blob/master/docs/source/notebooks/updating_priors.ipynb Basically, you need to wrap your posterior samples in a custom
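The idea behind "yesterday's posterior is today's prior" can be checked exactly in a conjugate Normal model with known noise sd, which is what makes the recursion valid in general (the linked notebook's trick of interpolating posterior samples approximates this for non-conjugate models):

```python
import numpy as np

def update(mu0, tau0, data, sigma=1.0):
    """Posterior of a Normal mean: N(mu0, tau0^2) prior, N(., sigma) noise."""
    prec = 1 / tau0**2 + len(data) / sigma**2
    mu = (mu0 / tau0**2 + data.sum() / sigma**2) / prec
    return mu, prec**-0.5

rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.0, 100)

# Updating on all data at once...
mu_a, tau_a = update(0.0, 10.0, data)
# ...equals updating on two batches, feeding the first posterior
# back in as the second prior.
mu_1, tau_1 = update(0.0, 10.0, data[:50])
mu_b, tau_b = update(mu_1, tau_1, data[50:])
print(np.allclose([mu_a, tau_a], [mu_b, tau_b]))  # prints True
```

For arbitrary posteriors the same recursion needs a density estimate of the samples (e.g. pm.Interpolated in the notebook) to serve as the next prior.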