linear-regression

Creating new Functions with Linear Regression in R

Submitted by 拥有回忆 on 2020-08-26 15:17:37
Question: I'm having trouble creating a function that calls the lm() function:

```r
regresionLineal <- function (vardep, varindep1, varindep2, DATA) {
  lm(vardep ~ varindep1 + varindep2, data = DATA)
}
```

I then call it with columns of a data frame I created previously (DATOS)...

```r
regresionLineal(Estatura, Largo, Ancho, DATOS)
Error in eval(expr, envir, enclos) : object 'Estatura' not found
Called from: eval(expr, envir, enclos)
```

Any help will be welcome...

Answer 1: You should do: regresionLineal <- function
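The answer above is cut off. As a hedged sketch of one common way to make such a function work (not necessarily what the original answer did), the column names can be passed as strings and assembled into a formula with reformulate(); the original error occurs because Estatura is looked up as an object in the calling environment rather than as a column of DATOS:

```r
# Sketch: build the formula from column names supplied as strings
regresionLineal <- function(vardep, varindep1, varindep2, DATA) {
  f <- reformulate(c(varindep1, varindep2), response = vardep)
  lm(f, data = DATA)
}

# Hypothetical call, assuming DATOS has these columns:
# regresionLineal("Estatura", "Largo", "Ancho", DATOS)
```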

Fastest way to calculate many regressions in python?

Submitted by 孤街醉人 on 2020-08-25 06:00:48
Question: I think I have a pretty reasonable idea of how to go about accomplishing this, but I'm not 100% sure on all of the steps. This question is mostly intended as a sanity check to ensure that I'm doing this in the most efficient way, and that my math is actually sound (since my statistics knowledge is not perfect). Anyway, some explanation of what I'm trying to do: I have a lot of time series data that I would like to perform some linear regressions on. In particular, I have
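The question is cut off above. As an illustrative sketch (with hypothetical data, not taken from the question), simple regressions of many series against a shared predictor can be computed in one vectorized pass using NumPy's closed-form least-squares formulas, which is typically much faster than looping over per-series fits:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(100, dtype=float)       # shared time index
Y = rng.normal(size=(10_000, 100))    # 10,000 series, one per row

# slope_i = sum((x - xbar) * (y_i - ybar_i)) / sum((x - xbar)^2), vectorized over rows
x_c = x - x.mean()
slopes = (Y - Y.mean(axis=1, keepdims=True)) @ x_c / (x_c @ x_c)
intercepts = Y.mean(axis=1) - slopes * x.mean()
```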

Get confidence intervals for regression coefficients of “mlm” object returned by `lm()`

Submitted by 限于喜欢 on 2020-08-25 04:25:26
Question: I'm running a multivariate regression with 2 outcome variables and 5 predictors. I would like to obtain the confidence intervals for all regression coefficients. Usually I use the function lm, but it doesn't seem to work for a multivariate regression model (an "mlm" object). Here's a reproducible example:

```r
library(car)
mod <- lm(cbind(income, prestige) ~ education + women, data=Prestige)
confint(mod) # doesn't return anything.
```

Any alternative way to do it? (I could just use the value of the
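As a hedged workaround sketch (one of several possibilities, not taken from an answer in this thread), each outcome can be fitted separately with lm(), after which confint() behaves as usual; the coefficient estimates and standard errors match the per-response results of the mlm fit:

```r
library(car)  # for the Prestige data

responses <- c("income", "prestige")
fits <- lapply(responses, function(y) {
  lm(reformulate(c("education", "women"), response = y), data = Prestige)
})
names(fits) <- responses

lapply(fits, confint)  # confidence intervals for each outcome's coefficients
```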

Access standardized residuals, Cook's values, hatvalues (leverage) etc. easily in Python?

Submitted by 拜拜、爱过 on 2020-08-21 13:36:13
Question: I am looking for influence statistics after fitting a linear regression. In R I can obtain them (e.g.) like this:

```r
hatvalues(fitted_model)      # hat values (leverage)
cooks.distance(fitted_model) # Cook's D values
rstandard(fitted_model)      # standardized residuals
rstudent(fitted_model)       # studentized residuals
```

etc. How can I obtain the same statistics when using statsmodels in Python after fitting a model like this:

```python
#import statsmodels
import statsmodels.api as sm
#Fit linear model to any dataset
model =
```
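A hedged sketch of one way to get the equivalent statistics from statsmodels' influence diagnostics (the data below is hypothetical, standing in for the truncated example above):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data standing in for "any dataset"
rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(50, 2)))
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=50)

model = sm.OLS(y, X).fit()
influence = model.get_influence()

leverage   = influence.hat_matrix_diag             # ~ hatvalues()
cooks_d, _ = influence.cooks_distance              # ~ cooks.distance()
r_standard = influence.resid_studentized_internal  # ~ rstandard()
r_student  = influence.resid_studentized_external  # ~ rstudent()
```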

Linear combination of regression coefficients in R [closed]

Submitted by 放肆的年华 on 2020-07-31 04:55:25
Question: Closed. This question needs details or clarity. It is not currently accepting answers. I need to run a multiple regression in R with the variables X1, X2 and X3, where there is a variable θ = β2 + β3. So instead of β2, for the coefficient of X2 I need to use (θ - β3). How could I do this?

Answer 1: Note that Y = b1 * x1 + (t - b3) * x2 + b3 * x3 is equivalent to Y =
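The answer above is cut off; the algebra it starts appears to be the rearrangement b1*x1 + (t - b3)*x2 + b3*x3 = b1*x1 + t*x2 + b3*(x3 - x2), which can be fitted directly. A minimal sketch with a hypothetical data frame standing in for the asker's data:

```r
# Hypothetical data frame standing in for the asker's data
set.seed(1)
df <- data.frame(x1 = rnorm(100), x2 = rnorm(100), x3 = rnorm(100))
df$y <- 1 * df$x1 + 2 * df$x2 + 3 * df$x3 + rnorm(100)

# The coefficient reported for x2 is theta = beta2 + beta3;
# the coefficient for I(x3 - x2) is beta3.
fit <- lm(y ~ x1 + x2 + I(x3 - x2), data = df)
summary(fit)
```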

Creating a loop through a list of variables for an LM model in R

Submitted by 泪湿孤枕 on 2020-07-30 09:06:01
Question: I am trying to create multiple linear regression models from a list of variable combinations (I also have them separately as a data frame if that is more useful!). The list of variables looks like this:

Vars
x1+x2+x3
x1+x2+x4
x1+x2+x5
x1+x2+x6
x1+x2+x7

The loop I'm using looks like this:

```r
for (i in 1:length(var_list)){
  lm(independent_variable ~ var_list[i], data = training_data)
  i+1
}
```

However, it is not recognizing the string in var_list[i] (which gives "x1+x2+x3" etc.) as a model input. Does anyone
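A hedged sketch of one common fix (the objects below are hypothetical stand-ins for the var_list, independent_variable and training_data described in the question): paste each combination into a formula string, convert it with as.formula() before passing it to lm(), and store the fitted models rather than discarding them:

```r
# Hypothetical stand-ins for the objects described in the question
training_data <- data.frame(matrix(rnorm(700), ncol = 7))
names(training_data) <- paste0("x", 1:7)
training_data$independent_variable <- rnorm(100)
var_list <- c("x1+x2+x3", "x1+x2+x4", "x1+x2+x5", "x1+x2+x6", "x1+x2+x7")

# Build one formula per variable combination and keep every fitted model
models <- vector("list", length(var_list))
for (i in seq_along(var_list)) {
  f <- as.formula(paste("independent_variable ~", var_list[i]))
  models[[i]] <- lm(f, data = training_data)
}
```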

Simple prediction using linear regression with python

Submitted by 狂风中的少年 on 2020-07-06 10:49:35
Question:

```python
data2 = pd.DataFrame(data1['kwh'])
data2
```

```
                          kwh
date
2012-04-12 14:56:50  1.256400
2012-04-12 15:11:55  1.430750
2012-04-12 15:27:01  1.369910
2012-04-12 15:42:06  1.359350
2012-04-12 15:57:10  1.305680
2012-04-12 16:12:10  1.287750
2012-04-12 16:27:14  1.245970
2012-04-12 16:42:19  1.282280
2012-04-12 16:57:24  1.365710
2012-04-12 17:12:28  1.320130
2012-04-12 17:27:33  1.354890
2012-04-12 17:42:37  1.343680
2012-04-12 17:57:41  1.314220
2012-04-12 18:12:44  1.311970
2012-04-12 18:27:46  1.338980
2012-04-12
```
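The question is cut off above. As a hedged sketch of one simple approach (not from an answer in this thread), the timestamps can be converted to seconds since the first observation and used as the single feature of a scikit-learn LinearRegression; the data frame below is a hypothetical stand-in for data2:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical stand-in for data2: a kwh series indexed by timestamp
idx = pd.date_range("2012-04-12 14:56:50", periods=20, freq="15min")
data2 = pd.DataFrame({"kwh": np.linspace(1.25, 1.35, 20)}, index=idx)

# Seconds since the first observation as the single predictor
t = (data2.index - data2.index[0]).total_seconds().to_numpy().reshape(-1, 1)
y = data2["kwh"].to_numpy()

model = LinearRegression().fit(t, y)

# Predict kwh one hour after the last observation
t_future = np.array([[t[-1, 0] + 3600]])
print(model.predict(t_future))
```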