mixed-models

Warning messages when using the glmer function from the lme4 package in R

∥☆過路亽.° submitted on 2019-12-24 19:02:05
Question: I am trying to fit a logistic random-intercept model using the glmer function from the lme4 package. Unfortunately I am getting the following warning messages and clearly wrong coefficient estimates.

Warning messages:
1: In vcov.merMod(object, use.hessian = use.hessian) :
  variance-covariance matrix computed from finite-difference Hessian is not positive definite: falling back to var-cov estimated from RX
2: In vcov.merMod(object, correlation = correlation, sigm = sig) :
  variance-covariance …
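These warnings usually mean the optimizer stopped near a point where the Hessian is poorly behaved. Below is a minimal sketch of how such a model is typically set up and sanity-checked; the data frame dat with binary outcome y, predictor x and grouping factor g is a placeholder, not the asker's data, and the checks are common first steps rather than a guaranteed fix.

```r
library(lme4)

# Hypothetical data frame 'dat' with binary y, numeric x, grouping factor g.
m <- glmer(y ~ x + (1 | g), family = binomial, data = dat)

# Non-positive-definite Hessian warnings often go along with badly scaled
# predictors or a near-singular random-effects fit, so common checks are:
m_scaled <- update(m, . ~ scale(x) + (1 | g))  # rescale the predictor
summary(allFit(m))                             # refit with several optimizers
isSingular(m)                                  # check for a degenerate fit
```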

Factor/level error in mixed model

无人久伴 submitted on 2019-12-23 04:52:26
Question: I am running a mixed model on something akin to this data:

df <- data.frame(stage    = c("a","a","a","a","b","b","b","b","c","c","c","c"),
                 nematode = c("fn","fn","bn","bn","fn","fn","bn","bn","fn","fn","bn","bn"),
                 id2      = c(1,2,3,4,1,2,3,4,1,2,3,4),
                 value    = c(1,0,0,2,3,1,1,2,0,0,0,2))

The model I am trying to fit is:

stage.id <- function(x) round(summary(glmer(value ~ stage + (1 | id2), family = "poisson", data = x))$coefficients[2, c(1, 2, 4)], 3)
models.id0 <- ddply(tree2, .(stage, nematode), stage.id)
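One plausible cause (an assumption on my part, since the error message itself is not shown above) is that splitting by stage leaves only a single stage level inside each subset, so a model with stage as a predictor cannot form contrasts there. A sketch that splits by nematode only, so every subset keeps all three stage levels; whether that matches the intended comparison is a separate modelling decision.

```r
library(lme4)
library(plyr)

df <- data.frame(stage    = c("a","a","a","a","b","b","b","b","c","c","c","c"),
                 nematode = c("fn","fn","bn","bn","fn","fn","bn","bn","fn","fn","bn","bn"),
                 id2      = c(1,2,3,4,1,2,3,4,1,2,3,4),
                 value    = c(1,0,0,2,3,1,1,2,0,0,0,2))

# Split only by nematode: each subset then contains stages a, b and c, so the
# 'stage' factor has enough levels for contrasts inside glmer().
stage.id <- function(x) {
  round(summary(glmer(value ~ stage + (1 | id2),
                      family = poisson, data = x))$coefficients[2, c(1, 2, 4)], 3)
}
models.id <- ddply(df, .(nematode), stage.id)
```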

glmmLasso try-error for all lambda

别来无恙 submitted on 2019-12-23 02:32:34
Question: I've been trying to use glmmLasso to do variable selection for a mixed model, but I can't seem to get the model to work. I've set up my model similarly to the demo found here. I'm using the simple method of using BIC to choose lambda. This is the code I've been running:

library(glmmLasso)
lambda <- seq(500, 0, by = -5)
family = binomial(link = logit)
library(MASS); library(nlme)
PQL <- glmmPQL(y ~ 1, random = ~ 1 | ID, family = family, data = train)
Delta.start <- c(as.numeric(PQL$coef$fixed), rep(0, 64), as.numeric…
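For comparison, here is a sketch of the BIC search along the lines of the glmmLasso demo, with each fit wrapped in try() so that a single failing lambda does not abort the loop. The response y, grouping factor ID and predictors x1–x3 are placeholders rather than the asker's variables, and the starting-value control from the demo is omitted for brevity.

```r
library(glmmLasso)

lambda  <- seq(500, 0, by = -5)
family  <- binomial(link = logit)
BIC_vec <- rep(Inf, length(lambda))

for (j in seq_along(lambda)) {
  fit <- try(glmmLasso(y ~ x1 + x2 + x3, rnd = list(ID = ~1),
                       family = family, data = train,
                       lambda = lambda[j]),
             silent = TRUE)
  if (!inherits(fit, "try-error")) BIC_vec[j] <- fit$bic
}

# If every element of BIC_vec is still Inf, run one glmmLasso() call outside
# try() to see the underlying error (often a factor-coding or starting-value
# mismatch rather than a problem with lambda itself).
```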

How to plot random intercept and slope in a mixed model with multiple predictors?

跟風遠走 submitted on 2019-12-21 02:43:07
Question: Is it possible to plot the random intercept or slope of a mixed model when it has more than one predictor? With one predictor I would do it like this:

# generate one response, two predictors and one factor (random effect)
resp  <- runif(100, 1, 100)
pred1 <- c(resp[1:50] + rnorm(50, -10, 10), resp[1:50] + rnorm(50, 20, 5))
pred2 <- resp + rnorm(100, -10, 10)
RF1   <- gl(2, 50)

# gamm
library(mgcv)
mod <- gamm(resp ~ pred1, random = list(RF1 = ~1))
plot(pred1, resp, type = "n")
for (i in ranef(mod$lme)[[1]]) { abline(fixef…
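A sketch of one common approach with more than one predictor (shown with nlme::lme rather than mgcv::gamm for brevity; the same idea carries over to the gamm fit through mod$lme): plot the response against one predictor, hold the other at its mean, and shift each group's line by its random intercept. Variable names come from the simulated data above.

```r
library(nlme)

d <- data.frame(resp, pred1, pred2, RF1)
m <- lme(resp ~ pred1 + pred2, random = ~ 1 | RF1, data = d)

fe <- fixef(m)                   # (Intercept), pred1, pred2
re <- ranef(m)[, "(Intercept)"]  # one random intercept per level of RF1

plot(pred1, resp, col = RF1)
for (u in re) {
  # line in pred1 with pred2 fixed at its mean, offset by the group intercept
  abline(a = fe["(Intercept)"] + u + fe["pred2"] * mean(pred2),
         b = fe["pred1"])
}
```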

Test for significance of an interaction in linear mixed models with nlme in R

蹲街弑〆低调 submitted on 2019-12-20 12:36:05
Question: I use the lme function in the nlme R package to test whether the levels of factor items interact significantly with the levels of factor condition. The factor condition has two levels, Control and Treatment, and the factor items has 3 levels: E1, ..., E3. I use the following code:

f.lme = lme(response ~ 0 + factor(condition) * factor(items), random = ~1|subject)

where subject is the random effect. In this way, when I run:

summary(f.lme)$tTable

I get the following output: factor(condition)Control…
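One standard way to test the interaction as a whole, rather than reading individual rows of the tTable, is to compare nested models fitted by maximum likelihood, or to request a marginal Wald test. A sketch, assuming a data frame dat holding response, condition, items and subject; note it uses conventional dummy coding rather than the 0 + cell-means coding above.

```r
library(nlme)

# Likelihood-ratio test of the interaction: refit with and without it under ML.
f.full <- lme(response ~ factor(condition) * factor(items),
              random = ~ 1 | subject, data = dat, method = "ML")
f.main <- lme(response ~ factor(condition) + factor(items),
              random = ~ 1 | subject, data = dat, method = "ML")
anova(f.main, f.full)

# Or a marginal (Type III style) Wald F test on a single fit:
anova(f.full, type = "marginal")
```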

Large fixed effects binomial regression in R

最后都变了- submitted on 2019-12-20 09:38:50
Question: I need to run a logistic regression on a relatively large data frame with 480,000 entries and 3 fixed-effect variables. Fixed-effect var A has 3233 levels, var B has 2326 levels, and var C has 811 levels, so all in all I have 6370 fixed effects. The data is cross-sectional. I can't run this regression using the normal glm function because the regression matrix seems too large for my memory (I get the message "Error: cannot allocate vector of size 22.9 Gb"). I am looking for alternative ways…
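A hedged sketch of one alternative (not necessarily the accepted answer): estimators that absorb high-dimensional fixed effects, or that build a sparse design matrix, avoid materialising the dense 6,370-column model matrix. The data frame d with binary outcome y and factor columns A, B and C is a placeholder for the asker's data.

```r
library(fixest)

# feglm() absorbs the three high-dimensional factors instead of expanding them
# into dummy columns; the binomial family gives a logistic regression.
fit <- feglm(y ~ 1 | A + B + C, family = binomial(), data = d)
fixef(fit)   # recover the estimated fixed effects if they are needed

# A sparse-matrix route along the same lines would be something like
# MatrixModels::glm4(y ~ A + B + C, family = binomial(), data = d, sparse = TRUE)
```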

How can I plot multiple residuals plots in a loop?

自古美人都是妖i submitted on 2019-12-18 09:28:42
Question: In the following example, I want to write the residuals plot of each model to a file. I do not need to see them on my display.

for (i in 1:500){
  temp.model <- lme(as.formula(paste("Var", i, " ~ X1*X2", sep="")), data = example, random = ~1| Exp/Person)
  jpeg(paste("C:/Myfolder", i, ".jpg", sep = ""), quality=50, bg="white")
  plot(temp.model)
  dev.off()
  graphics.off()
}

When I run this code without the loop, I obtain what I want. However, it creates blank files within the loop. Any ideas? Thank you.

Answer 1: The…
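The answer above is cut off, but the usual explanation (R FAQ 7.22) is that plot() on an lme fit returns a lattice/trellis object, which is auto-printed only at the top level; inside a loop it must be printed explicitly. A sketch of the loop with that change:

```r
library(nlme)

for (i in 1:500) {
  temp.model <- lme(as.formula(paste0("Var", i, " ~ X1*X2")),
                    data = example, random = ~ 1 | Exp/Person)
  jpeg(paste0("C:/Myfolder", i, ".jpg"), quality = 50, bg = "white")
  print(plot(temp.model))   # explicit print(), otherwise the file stays blank
  dev.off()
}
```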

Using the profile and boot methods with the confint option for a glmer model

一曲冷凌霜 submitted on 2019-12-14 04:21:05
Question: I am using glmer with a logit link for a gaussian error model. When I try to obtain the confidence intervals, using either the profile or the boot method with the confint option, I get an error. With the profile likelihood:

Profile: Computing profile confidence intervals ...
Error in names(opt) <- profnames(fm, signames) :
  'names' attribute [2] must be the same length as the vector [1]

and with bootstrapping:

Boot: Error in if (const(t, min(1e-08, mean(t, na.rm = TRUE)/1e+06))) {…
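For reference, a sketch of how the two confint methods are normally called on a glmer fit, using the cbpp example data shipped with lme4; this is a genuine binomial GLMM rather than the asker's gaussian model with a logit link, which lme4 treats as a special case.

```r
library(lme4)

# Standard binomial GLMM from ?glmer, used here only to illustrate the calls.
gm <- glmer(cbind(incidence, size - incidence) ~ period + (1 | herd),
            family = binomial, data = cbpp)

confint(gm, method = "profile")           # profile-likelihood intervals
confint(gm, method = "boot", nsim = 200)  # parametric bootstrap intervals
```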

Get significance of simple effects with emtrends

折月煮酒 submitted on 2019-12-14 02:44:57
Question: I can get the significance of pairwise comparisons with the following code:

m <- lmer(angle ~ recipe*temp + (1|replicate), data=cake)
emtrends(m, pairwise~recipe, var="temp")

$emtrends
 recipe temp.trend         SE  df   lower.CL  upper.CL
 A       0.1537143 0.02981898 250 0.09498586 0.2124427
 B       0.1645714 0.02981898 250 0.10584300 0.2232999
 C       0.1558095 0.02981898 250 0.09708110 0.2145379

$contrasts
 contrast     estimate        SE  df t.ratio p.value
 A - B    -0.010857143 0.0421704 250  -0.257  0.9641
 A - C    -0.002095238 0.0421704…
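If the goal is a test of each simple slope against zero rather than the pairwise differences, one option (a sketch, not necessarily the accepted answer) is to test the emtrends object directly:

```r
library(lme4)
library(emmeans)

m   <- lmer(angle ~ recipe * temp + (1 | replicate), data = cake)
emt <- emtrends(m, ~ recipe, var = "temp")

test(emt)                            # t tests of each recipe's temp slope vs 0
summary(emt, infer = c(TRUE, TRUE))  # confidence intervals and p-values together
```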

Mixed model starting values for lme4

对着背影说爱祢 submitted on 2019-12-13 15:30:56
Question: I am trying to fit a mixed model using the lmer function from the lme4 package. However, I do not understand what should be passed to the start parameter. My purpose is to fit a simple linear regression first and use the coefficients estimated there as starting values for the mixed model. Let's say that my model is the following:

linear_model = lm(y ~ x1 + x2 + x3, data = data)
coef = summary(linear_model)$coefficients[-1, 1]  # I remove the intercept
result = lmer(y ~ x1 + x2 + x3 | x1 + x2 + x3, data…
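For lmer, the start argument expects starting values for the random-effects covariance parameters (theta), not the fixed-effect coefficients, which are profiled out of the deviance; glmer additionally accepts fixed-effect starting values. A sketch with a placeholder random-effects term (1 | group), since the formula above is cut off:

```r
library(lme4)

# Starting values for lmer are the covariance parameters 'theta'; here they are
# taken from an initial fit via getME(). The (1 | group) term is a placeholder.
m0 <- lmer(y ~ x1 + x2 + x3 + (1 | group), data = data)
m1 <- lmer(y ~ x1 + x2 + x3 + (1 | group), data = data,
           start = list(theta = getME(m0, "theta")))

# glmer(), by contrast, also takes fixed-effect starting values, e.g.
# start = list(fixef = coef(linear_model), theta = getME(m0, "theta"))
```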