pymc

PyMC regression of many regressions?

Submitted by 白昼怎懂夜的黑 on 2019-12-12 05:49:21
Question: I haven't been using PyMC for long, but I was pleased at how quickly I was able to get a linear regression off the ground (this code should run without modification in IPython):

    import pandas as pd
    from numpy import *
    import pymc

    data = pd.DataFrame(rand(40))
    predictors = pd.DataFrame(rand(40, 5))
    sigma = pymc.Uniform('sigma', 0.0, 200.0, value=20)
    params = array([pymc.Normal('%s_coef' % (c), mu=0, tau=1e-3, value=0)
                    for c in predictors.columns])

    @pymc.deterministic(plot=False)
    def linear_regression
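The post is cut off at the deterministic's definition. As a rough, hypothetical completion (the regression function, likelihood, and node names below are my own guesses, not the original poster's code), assuming the aim is an ordinary linear model of data on the five predictor columns:

    import numpy as np
    import pandas as pd
    import pymc

    data = pd.DataFrame(np.random.rand(40))
    predictors = pd.DataFrame(np.random.rand(40, 5))

    sigma = pymc.Uniform('sigma', 0.0, 200.0, value=20)
    params = [pymc.Normal('%s_coef' % c, mu=0, tau=1e-3, value=0)
              for c in predictors.columns]

    @pymc.deterministic(plot=False)
    def linear_regression(params=params, X=predictors.values):
        # linear predictor: X dot the current coefficient values
        return np.dot(X, np.array(params))

    precision = pymc.Lambda('precision', lambda s=sigma: s ** -2)
    y = pymc.Normal('y', mu=linear_regression, tau=precision,
                    value=data[0].values, observed=True)

    m = pymc.MCMC([sigma, precision, linear_regression, y] + params)
    m.sample(5000, 1000)

The coefficient traces are then available as m.trace('0_coef')[:], m.trace('1_coef')[:], and so on, following the '%s_coef' naming used above.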

KeyError while printing trace in PyMC

Submitted by 独自空忆成欢 on 2019-12-12 01:39:16
Question: I had read that by default some names are assigned to Stochastic variables. I am writing the relevant portion of my code below.

    lam = pm.Uniform('lam', lower=0.0, upper=5, doc='lam')
    parameters = pm.Dirichlet('parameters', [1, 1, 1, 1], doc='parameters')
    rv = [pm.Multinomial("rv" + str(i), count[i], prob_distribution[i],
                         value=data[i], observed=True)
          for i in xrange(0, len(count))]
    m = pm.MCMC([lam, parameters, rv])
    m.sample(10)
    print m.trace('lam')[:]
    print m.trace('parameters_0')[:]

The last
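One way around guessing the auto-generated trace names (the KeyError suggests there is no node called 'parameters_0') is to ask the MCMC object which stochastics it is tracking and read the traces under those names. A minimal, self-contained sketch, with the Multinomial part dropped so it runs without the original count/prob_distribution/data arrays:

    import pymc as pm

    lam = pm.Uniform('lam', lower=0.0, upper=5.0)
    parameters = pm.Dirichlet('parameters', theta=[1, 1, 1, 1])

    m = pm.MCMC([lam, parameters])
    m.sample(1000)

    # print every stored trace name instead of guessing
    for node in m.stochastics:
        print(node.__name__)
        print(m.trace(node.__name__)[:])

Here the Dirichlet trace lives under the node's own name, 'parameters', and each row holds its k-1 free components.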

PyMC2 and PyMC3 give different results…?

Submitted by 走远了吗. on 2019-12-11 11:36:28
Question: I'm trying to get a simple PyMC2 model working in PyMC3. I've gotten the model to run, but the two give very different MAP estimates for the variables. Here is my PyMC2 model:

    import numpy as np
    import pymc

    theta = pymc.Normal('theta', 0, .88)
    X1 = pymc.Bernoulli('X2', p=pymc.Lambda('a', lambda theta=theta: 1. / (1 + np.exp(-(theta - (-0.75))))),
                        value=[1], observed=True)
    X2 = pymc.Bernoulli('X3', p=pymc.Lambda('b', lambda theta=theta: 1. / (1 + np.exp(-(theta - 0)))),
                        value=[1], observed=True)
    model = pymc.Model([theta, X1,
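For comparison, a sketch of how the same two-observation model might be written in PyMC3 (node names and the use of find_MAP are illustrative). One detail worth checking when porting: PyMC2's Normal('theta', 0, .88) takes a precision tau as its third positional argument, while PyMC3's Normal takes a standard deviation there, so passing tau explicitly keeps the two priors comparable:

    import pymc3 as pm

    with pm.Model() as model:
        # match PyMC2's precision parameterization explicitly
        theta = pm.Normal('theta', mu=0.0, tau=0.88)
        p1 = pm.math.sigmoid(theta - (-0.75))
        p2 = pm.math.sigmoid(theta - 0.0)
        X1 = pm.Bernoulli('X1', p=p1, observed=[1])
        X2 = pm.Bernoulli('X2', p=p2, observed=[1])
        map_estimate = pm.find_MAP()

    print(map_estimate)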

Negative Binomial Mixture in PyMC

Submitted by 旧巷老猫 on 2019-12-11 06:56:04
Question: I am trying to fit a negative binomial mixture with PyMC. It seems I am doing something wrong, because the posterior predictive doesn't look at all similar to the input data. The problem is probably in the priors on the negative binomial parameters. Any suggestions?

    from sklearn.cluster import KMeans
    import pymc as mc

    n = 3  # number of components in the mixture
    ndata = len(data)
    dd = mc.Dirichlet('dd', theta=(1,) * n)
    category = mc.Categorical('category', p=dd, size=ndata)
    kme = KMeans(n)  # This is not needed
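Since the snippet stops before the negative binomial parameters are defined, here is a hedged sketch of one way the mixture could be set up, with made-up toy data and my own prior choices (mu/alpha follow PyMC2's NegativeBinomial parameterization). Keeping the priors on the scale of the observed counts, rather than extremely vague, is often what makes the posterior predictive resemble the data:

    import numpy as np
    import pymc as mc

    # toy over-dispersed counts standing in for the original `data`
    data = np.concatenate([np.random.negative_binomial(5, 0.3, size=200),
                           np.random.negative_binomial(20, 0.6, size=200)])
    n = 3
    ndata = len(data)

    dd = mc.Dirichlet('dd', theta=(1,) * n)
    category = mc.Categorical('category', p=dd, size=ndata)

    # priors on the component means and dispersions, on the scale of the data
    mus = mc.Uniform('mus', lower=0.0, upper=float(data.max()), size=n)
    alphas = mc.Uniform('alphas', lower=0.0, upper=50.0, size=n)

    @mc.deterministic
    def mu_i(category=category, mus=mus):
        return mus[category]

    @mc.deterministic
    def alpha_i(category=category, alphas=alphas):
        return alphas[category]

    obs = mc.NegativeBinomial('obs', mu=mu_i, alpha=alpha_i,
                              value=data, observed=True)

    model = mc.MCMC([dd, category, mus, alphas, mu_i, alpha_i, obs])
    model.sample(10000, 5000)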

Passing parameters to deterministic variables, pymc

Submitted by 与世无争的帅哥 on 2019-12-11 05:26:08
Question: I am trying to implement a very simple example of the law of large numbers using PyMC. The goal is to generate many sample averages for samples of different sizes. For example, in the code below, I'm repeatedly taking groups of 5 samples (samples_to_average = 5), calculating their mean, and then finding the 95% CI of the resulting trace. The code below runs, but what I'd like to do is modify samples_to_average to be a list, so that I can calculate confidence intervals for a range of
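The question text is cut off before the code, so the snippet below is only a guess at the shape of the problem: it builds one averaging node per sample size by passing the size into a small factory function, which is one straightforward way to "pass parameters to deterministic variables" in PyMC2 (all names here are mine):

    import numpy as np
    import pymc

    population = np.random.normal(10.0, 3.0, size=100000)

    def make_average(n):
        # n random indices into the population, resampled at each iteration
        idx = pymc.DiscreteUniform('idx_%d' % n, 0, len(population) - 1, size=n)
        # pymc.Lambda lets each deterministic carry an explicit, size-specific name
        avg = pymc.Lambda('avg_%d' % n, lambda idx=idx: population[idx].mean())
        return [idx, avg]

    sizes = [5, 10, 50, 100]
    nodes = []
    for n in sizes:
        nodes.extend(make_average(n))

    m = pymc.MCMC(nodes)
    m.sample(10000, 1000)

    for n in sizes:
        trace = m.trace('avg_%d' % n)[:]
        print(n, np.percentile(trace, [2.5, 97.5]))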

PyMC3 - Custom theano Op to do numerical integration

Submitted by 我是研究僧i on 2019-12-11 04:08:40
Question: I am using PyMC3 for parameter estimation with a particular likelihood function which has to be defined. I googled it and found out that I should use DensityDist for implementing user-defined likelihood functions, but it is not working. How do I incorporate a user-defined likelihood function in PyMC3 and find the maximum a posteriori (MAP) estimate for my model? My code is given below. Here L is the analytic form of my likelihood function. I have some observational data
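A minimal sketch of the DensityDist pattern, with a stand-in Gaussian log-density in place of the analytic likelihood L from the question (the priors, names, and toy data are assumptions of mine):

    import numpy as np
    import pymc3 as pm
    import theano.tensor as tt

    observed = np.random.normal(1.0, 0.5, size=200)  # toy data

    with pm.Model() as model:
        mu = pm.Uniform('mu', -5.0, 5.0)
        sigma = pm.HalfNormal('sigma', 2.0)

        def loglike(value):
            # replace with the analytic log-likelihood log L(value | parameters)
            return tt.sum(-0.5 * ((value - mu) / sigma) ** 2 - tt.log(sigma))

        y = pm.DensityDist('y', loglike, observed=observed)
        map_estimate = pm.find_MAP()

    print(map_estimate)

find_MAP returns a dictionary of point estimates for mu and sigma; calling pm.sample() inside the same model block would give the full posterior instead.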

PyMC error: hasattr(): attribute name must be string

Submitted by 不问归期 on 2019-12-11 01:45:41
Question: I'm having an issue running inference on a model in PyMC. I'm trying to run MCMC over a fairly complicated model, and I'm getting an error:

    hasattr(): attribute name must be string

I get this on the final line of this block of code (apologies, it's complicated, but I'm really not sure where the issue might be).

    import pymc
    from matplotlib import pyplot as plt
    import numpy as np

    # a is a temp variable
    # A is the data: a (2, 779)-shaped array of 0 and 1 only
    a = np.loadtxt("PLOM3/data
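The model itself is cut off here, so there is no way to say where the error comes from. One generic way to narrow it down is to rebuild the model a piece at a time, starting from a minimal observed node over data of the same (2, 779) 0/1 shape and handing pymc.MCMC an explicit list of nodes; everything in the sketch below is a stand-in, not the poster's model:

    import numpy as np
    import pymc

    A = np.random.randint(0, 2, size=(2, 779))  # stand-in for the real data

    p = pymc.Uniform('p', 0.0, 1.0, size=2)

    @pymc.deterministic(plot=False)
    def p_full(p=p):
        # broadcast the two per-row probabilities across the 779 columns
        return np.repeat(p[:, None], A.shape[1], axis=1)

    obs = pymc.Bernoulli('obs', p=p_full, value=A, observed=True)

    m = pymc.MCMC([p, p_full, obs])
    m.sample(2000, 500)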

How to use pymc to parameterize a probabilistic graphical model?

Submitted by 元气小坏坏 on 2019-12-10 14:43:26
Question: How can one use pymc to parameterize a probabilistic graphical model? Suppose I have a PGM with two nodes, X and Y. Let's say X -> Y is the graph. X takes two values {0, 1}, and Y also takes two values {0, 1}. I want to use pymc to learn the parameters of the distribution and populate the graphical model with them for running inferences. The way I could think of is as follows:

    X_p = pm.Uniform("X_p", 0, 1)
    X = pm.Bernoulli("X", X_p, values=X_Vals, observed=True)
    Y0_p = pm.Uniform("Y0_p", 0, 1)
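A sketch of one way to finish this X -> Y construction in PyMC2: keep one Bernoulli parameter for Y per value of the parent X and pick between them with a deterministic, using made-up observed arrays so the snippet runs on its own (the Y0_p/Y1_p naming extends the question's convention):

    import numpy as np
    import pymc as pm

    X_vals = np.random.randint(0, 2, size=100)  # toy observations for X
    Y_vals = np.random.randint(0, 2, size=100)  # toy observations for Y

    X_p = pm.Uniform('X_p', 0, 1)
    X = pm.Bernoulli('X', X_p, value=X_vals, observed=True)

    Y0_p = pm.Uniform('Y0_p', 0, 1)  # P(Y=1 | X=0)
    Y1_p = pm.Uniform('Y1_p', 0, 1)  # P(Y=1 | X=1)

    # choose the parameter row-wise according to the observed parent value
    Y_p = pm.Lambda('Y_p', lambda p0=Y0_p, p1=Y1_p: np.where(X_vals == 1, p1, p0))
    Y = pm.Bernoulli('Y', Y_p, value=Y_vals, observed=True)

    m = pm.MCMC([X_p, X, Y0_p, Y1_p, Y_p, Y])
    m.sample(5000, 1000)
    print(m.trace('Y1_p')[:].mean())  # posterior mean of P(Y=1 | X=1)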

PyMC - Wishart distribution for covariance estimate

Submitted by 时光毁灭记忆、已成空白 on 2019-12-10 11:49:41
Question: I need to model and estimate a variance-covariance matrix from asset class returns, so I was looking at the stock returns example given in chapter 6 of https://github.com/CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers Here is my simple implementation, where I start with a sample drawn from a multivariate normal with a known mean and variance-covariance matrix. I then try to estimate it using a non-informative prior. The estimate is different from the known prior, so I'm
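The implementation itself is truncated, so the following is only a sketch of the usual PyMC2 pattern for this problem, with synthetic two-asset returns of my own: put a Wishart prior on the precision matrix (PyMC2's MvNormal is parameterized by precision) and invert the posterior draws to read off the covariance estimate:

    import numpy as np
    import pymc as pm

    true_mu = np.array([0.01, 0.02])
    true_cov = np.array([[0.04, 0.01],
                         [0.01, 0.09]])
    returns = np.random.multivariate_normal(true_mu, true_cov, size=500)
    k = returns.shape[1]

    # Wishart prior on the precision matrix tau = inverse covariance
    tau = pm.Wishart('tau', n=k + 1, Tau=np.eye(k))
    mu = pm.Normal('mu', 0.0, 1e-2, size=k)

    obs = pm.MvNormal('obs', mu=mu, tau=tau, value=returns, observed=True)

    m = pm.MCMC([tau, mu, obs])
    m.sample(20000, 10000)

    # posterior mean of the implied covariance matrix, for comparison with true_cov
    tau_trace = m.trace('tau')[:]
    cov_est = np.mean([np.linalg.inv(t) for t in tau_trace], axis=0)
    print(cov_est)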

Designing a simple Binomial distribution throws core dump in pymc

Submitted by 淺唱寂寞╮ on 2019-12-10 11:44:41
Question: I am trying to design a simple binomial distribution in pymc. However, it fails with the error below; the same code works fine if I use a Poisson distribution instead of a Binomial.

    import pymc as pm
    from pymc import Beta, Binomial, Exponential
    import numpy as np
    from pymc.Matplot import plot as mcplot

    data = pm.rbinomial(5, 0.01, size=100)
    p = Beta("p", 1, 1)
    observations = Binomial("obs", 5, p, value=data, observed=True)
    model = pm.Model([p, observations])
    mcmc = pm.MCMC(model)
    mcmc.sample(400, 100, 2)
    mcplot
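Without the full traceback it is hard to say what triggers the crash, but a couple of low-risk changes are worth trying when chasing it: generate the observed counts with a less extreme success probability, cast them to a plain integer array, pass the Binomial parameters by keyword, and hand MCMC the node list directly. A sketch along those lines (not a confirmed fix for the reported core dump):

    import numpy as np
    import pymc as pm
    from pymc import Beta, Binomial
    from pymc.Matplot import plot as mcplot

    data = pm.rbinomial(5, 0.25, size=100).astype(int)  # plain integer counts

    p = Beta('p', alpha=1, beta=1)
    observations = Binomial('obs', n=5, p=p, value=data, observed=True)

    mcmc = pm.MCMC([p, observations])
    mcmc.sample(4000, 1000, 2)
    mcplot(mcmc)  # trace, autocorrelation, and histogram plots for p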