lm

update() a model inside a function with a local covariate

断了今生、忘了曾经 submitted on 2019-12-23 07:56:20
Question: I need to update a regression model from inside a function. Ideally, the function should work with any kind of model (lm, glm, multinom, clm). More precisely, I need to add one or several covariates that are defined inside the function. Here is an example:

    MyUpdate <- function(model){
      randData <- data.frame(var1 = rnorm(length(model$residuals)))
      model2 <- update(model, ".~.+randData$var1")
      return(model2)
    }

Here is an example use:

    data(iris)
    model1 <- lm(Sepal.Length ~ Species, data = iris)
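The excerpt stops before the failure itself, but the usual problem with this pattern is that update() re-evaluates the model's original call in an environment where the locally created covariate is not visible. A minimal sketch of one common workaround, which augments the model's own data and passes it to update() explicitly (rand_var is a name introduced here for illustration):

    MyUpdate <- function(model) {
      d <- model.frame(model)        # the data the model was fitted on
      d$rand_var <- rnorm(nrow(d))   # covariate created inside the function
      update(model, . ~ . + rand_var, data = d)
    }

    model1 <- lm(Sepal.Length ~ Species, data = iris)
    model2 <- MyUpdate(model1)       # refit with the extra covariate

This works for lm and glm in simple cases; model.frame() can rename transformed terms, so models such as multinom or clm may need the original data passed through instead.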

LASSO with $\lambda = 0$ and OLS produce different results in R glmnet

送分小仙女□ submitted on 2019-12-23 07:02:14
Question: I expect LASSO with no penalization ($\lambda=0$) to yield the same (or very similar) coefficient estimates as an OLS fit. However, I get different coefficient estimates in R when I put the same data (x, y) into glmnet(x, y, alpha=1, lambda=0) for the LASSO fit with no penalization and lm(y ~ x) for the OLS fit. Why is that?

Answer 1: You're using the function wrong. The x should be the model matrix, not the raw predictor values. When you do that, you get the exact same results:

    x <- rnorm(500)
    y <- rnorm(500)
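For completeness, a sketch of the comparison with a proper model matrix; note that glmnet() requires a predictor matrix with at least two columns, and tightening thresh makes the coordinate-descent solution agree with OLS to more decimal places (data simulated here for illustration):

    library(glmnet)
    set.seed(42)
    x <- matrix(rnorm(500 * 3), ncol = 3)          # model matrix, not a raw vector
    y <- drop(x %*% c(1, -2, 0.5)) + rnorm(500)
    fit0 <- glmnet(x, y, alpha = 1, lambda = 0, thresh = 1e-12)
    cbind(glmnet = as.numeric(coef(fit0)),         # intercept + three slopes
          ols    = coef(lm(y ~ x)))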

Showing equation of nls model with ggpmisc

♀尐吖头ヾ submitted on 2019-12-23 06:57:47
Question: The R package ggpmisc can be used to show the equation of an lm or poly model on a ggplot2 plot (see here for reference). I wonder how to show the equation of an nls model on ggplot2 using ggpmisc. Below is my MWE.

    library(ggpmisc)
    args <- list(formula = y ~ k * e ^ x, start = list(k = 1, e = 2))
    ggplot(mtcars, aes(wt, mpg)) +
      geom_point() +
      stat_fit_augment(method = "nls", method.args = args)

Answer 1: Inspired by the post you linked. Use geom_text to add the label after extracting the parameters.

    nlsFit <- nls
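The answer is cut off after the nls() call; a sketch in the same spirit, extracting the fitted parameters and annotating the plot with a plotmath label (the label position and rounding are my own choices):

    library(ggpmisc)   # attaches ggplot2
    nlsFit <- nls(mpg ~ k * e ^ wt, data = mtcars, start = list(k = 1, e = 2))
    co <- round(coef(nlsFit), 2)
    eq <- paste0("y == ", co["k"], " %.% ", co["e"], "^x")   # plotmath equation
    ggplot(mtcars, aes(wt, mpg)) +
      geom_point() +
      stat_fit_augment(method = "nls",
                       method.args = list(formula = y ~ k * e ^ x,
                                          start = list(k = 1, e = 2))) +
      annotate("text", x = 4.5, y = 30, label = eq, parse = TRUE)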

How to use lm.fit instead of lm [closed]

試著忘記壹切 submitted on 2019-12-23 06:40:08
Closed 6 years ago as unclear what is being asked.

Question: I want to use lm.fit for speed, but the second version gives NAs:

    sum <- summary(lm(y ~ x))
    slope <- sum$coefficients[2]

or

    sum <- lm.fit(as.matrix(x, ncol = 1), y)
    slope <- sum$coefficients[2]

EDIT 1: I now see that sum
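The excerpt ends mid-edit, but the NA has a known cause: lm.fit() expects the full design matrix and, unlike lm(), does not add an intercept column automatically, so the one-column matrix yields a single coefficient and indexing the second one returns NA. A minimal sketch (x and y invented for illustration):

    set.seed(1)
    x <- rnorm(100)
    y <- 2 * x + rnorm(100)
    fit <- lm.fit(cbind(Intercept = 1, x), y)  # bind the intercept column yourself
    slope <- fit$coefficients[2]               # now matches coef(lm(y ~ x))[2]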

Create a new lm object using R?

末鹿安然 submitted on 2019-12-23 05:25:49
Question: Assuming I have an X matrix and y vector such as the following:

    X =
          [,1]    [,2]  [,3]
    [1,]  83.0 234.289 235.6 ...
    [2,]  88.5 259.426 232.5 ...
    [3,]  88.2 258.054 368.2 ...

    y = [1] 60.323 61.122 60.171 ...

After conducting a decomposition, I find the coefficients B and residuals E. How can I create a new lm object (i.e. lmQ) that stores my results for B and E, in order to get something like this:

    > lmQ(X)
    $coefficients
    X1 X2
    B1 B2

    $residuals
    [1] R1 R2 R3 ...

Please help, thanks!

Answer 1: Objects are just lists
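One way to read the answer: an lm object is just a list with class "lm", so you can assemble one from your own decomposition output. A minimal sketch (toy values; B and E stand in for the question's coefficients and residuals):

    lmQ <- function(B, E) {
      obj <- list(coefficients = B, residuals = E, call = match.call())
      class(obj) <- "lm"   # caution: summary(), predict() etc. expect more
      obj                  # components (qr, rank, fitted.values, ...)
    }

    fit <- lmQ(B = c(X1 = 1.2, X2 = -0.7), E = c(0.11, -0.23, 0.05))
    fit$coefficients
    fit$residuals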

How to conduct linear hypothesis test on regression coefficients with a clustered covariance matrix?

狂风中的少年 submitted on 2019-12-23 02:05:11
Question: I am interested in calculating estimates and standard errors for linear combinations of coefficients after a linear regression in R. For example, suppose I have the regression and test:

    data(mtcars)
    library(multcomp)
    lm1 <- lm(mpg ~ cyl + hp, data = mtcars)
    summary(glht(lm1, linfct = 'cyl + hp = 0'))

This will estimate the value of the sum of the coefficients on cyl and hp, and provide the standard error based on the covariance matrix produced by lm. But suppose I want to cluster my
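The question is truncated, but the usual route is to hand glht() a clustered covariance matrix through its vcov. argument, for example one built with sandwich::vcovCL(). A sketch, with gear used as a stand-in cluster variable:

    library(multcomp)
    library(sandwich)
    lm1 <- lm(mpg ~ cyl + hp, data = mtcars)
    vc <- vcovCL(lm1, cluster = mtcars$gear)   # clustered covariance matrix
    summary(glht(lm1, linfct = "cyl + hp = 0", vcov. = vc))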

How to set contrasts for my variable in regression analysis with R?

狂风中的少年 submitted on 2019-12-22 19:51:12
Question: During coding, I need to change the dummy value assigned to a factor. However, the following code does not work. Any suggestions?

    test_mx <- data.frame(a = c(T, T, T, F, F, F), b = c(1, 1, 1, 0, 0, 0))
    test_mx
    #       a b
    # 1  TRUE 1
    # 2  TRUE 1
    # 3  TRUE 1
    # 4 FALSE 0
    # 5 FALSE 0
    # 6 FALSE 0

    model <- glm(b ~ a, data = test_mx, family = "binomial")
    summary(model)
    model <- glm(a ~ b, data = test_mx, family = "binomial")
    summary(model)

Here I get a coefficient of 47 for b. Now if I swap the dummy values, it should be -47. However, this
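One way to flip which level is the baseline is to make the variable an explicit factor and set its contrasts (or use relevel()). A sketch; note that in this toy data the outcome is perfectly separated, so the ±47 estimates are unstable artifacts of separation rather than meaningful coefficients:

    test_mx <- data.frame(a = factor(c(T, T, T, F, F, F)),
                          b = c(1, 1, 1, 0, 0, 0))
    contrasts(test_mx$a)                                  # default: FALSE = 0, TRUE = 1
    contrasts(test_mx$a) <- contr.treatment(2, base = 2)  # make TRUE the baseline
    model <- glm(b ~ a, data = test_mx, family = "binomial")
    coef(model)                                           # sign of the slope flips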

How to set up balanced one-way ANOVA for lm()

会有一股神秘感。 submitted on 2019-12-22 12:38:53
Question: I have data:

    dat <- data.frame(
      NS     = c(8.56, 8.47, 6.39, 9.26, 7.98, 6.84, 9.2, 7.5),
      EXSM   = c(7.39, 8.64, 8.54, 5.37, 9.21, 7.8, 8.2, 8),
      Less.5 = c(5.97, 6.77, 7.26, 5.74, 8.74, 6.3, 6.8, 7.1),
      More.5 = c(7.03, 5.24, 6.14, 6.74, 6.62, 7.37, 4.94, 6.34)
    )
    #     NS EXSM Less.5 More.5
    # 1 8.56 7.39   5.97   7.03
    # 2 8.47 8.64   6.77   5.24
    # 3 6.39 8.54   7.26   6.14
    # 4 9.26 5.37   5.74   6.74
    # 5 7.98 9.21   8.74   6.62
    # 6 6.84 7.80   6.30   7.37
    # 7 9.20 8.20   6.80   4.94
    # 8 7.50 8.00   7.10   6.34

Each column gives data
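The question is cut off here, but with one group per column the usual route is to reshape to long format so lm() sees a single response and a group factor. A minimal sketch:

    long <- stack(dat)                  # 'values' = response, 'ind' = group factor
    fit <- lm(values ~ ind, data = long)
    anova(fit)                          # balanced one-way ANOVA table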

Differences in Linear Regression in R and Python [closed]

99封情书 submitted on 2019-12-22 12:17:10
Closed 3 years ago as off-topic for Stack Overflow.

Question: I was trying to match the linear regression results from R with those from Python, i.e. matching the coefficients for each independent variable. Below is the code. The data is uploaded at:

https://www.dropbox.com/s/oowe4irm9332s78/X.csv?dl=0
https://www.dropbox.com/s/79scp54unzlbwyk/Y.csv?dl=0

R code:

    # define pathname = " "
    X
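The code excerpt is cut off, but the most frequent cause of R/Python coefficient mismatches is intercept handling. A sketch of the R side (file names from the question; the single-column layout of Y.csv is an assumption):

    X <- read.csv("X.csv")
    Y <- read.csv("Y.csv")
    fit <- lm(Y[[1]] ~ ., data = X)   # R formulas include an intercept by default
    coef(fit)
    # To match in Python: sklearn's LinearRegression(fit_intercept=True) also
    # fits an intercept; statsmodels' sm.OLS needs sm.add_constant(X) first.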

regression on subsets for unique factor combinations using lm

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-22 09:37:49
Question: I would like to automate a simple multiple regression for the subsets defined by the unique combinations of the grouping variables. I have a dataframe with several grouping variables df1[, 1:6], some independent variables df1[, 8:10], and a response df1[, 7]. This is an excerpt from the data:

    structure(list(Surface = structure(c(1L, 1L, 1L, 1L, 1L, 2L, 2L,
      2L, 2L, 2L, 2L), .Label = c("NiAu", "Sn"), class = "factor"),
      Supplier = structure(c(1L, 1L, 1L, 2L, 2L, 1L, 1L, 2L, 2L, 2L, 2L), .Label =
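The dput() output is truncated, but the general pattern does not depend on the exact columns: split the data on the interaction of the grouping variables and fit the same formula to each subset. A sketch built from the column positions stated in the question:

    # build the formula from column names: response is column 7, predictors 8:10
    f <- reformulate(names(df1)[8:10], response = names(df1)[7])
    groups <- interaction(df1[, 1:6], drop = TRUE)   # unique grouping combinations
    fits <- lapply(split(df1, groups), function(d) lm(f, data = d))
    lapply(fits, coef)                               # coefficients per subset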