lasso-regression

Comparing the GLMNET output of R with Python using LogisticRegression()

Submitted by 廉价感情 on 2021-01-29 20:46:50
Question: I am using logistic regression with the L1 norm (LASSO). I have opted to use the glmnet package in R and LogisticRegression() from sklearn.linear_model in Python. From my understanding these should give the same results; however, they do not. Note that I did not scale my data. For Python I used the following link as a reference: https://chrisalbon.com/machine_learning/logistic_regression/logistic_regression_with_l1_regularization/ and for R I used: http://www
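A frequent cause of this mismatch is the parameterization: glmnet penalizes via lambda on the averaged log-likelihood, while sklearn uses an inverse strength C on the summed loss, so roughly C ≈ 1/(n·lambda); solver and standardization defaults also differ. A minimal sketch of the sklearn side, assuming synthetic data and a hypothetical lambda value:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

lam = 0.1              # hypothetical glmnet lambda
C = 1.0 / (n * lam)    # approximate sklearn equivalent of that lambda
clf = LogisticRegression(penalty="l1", C=C, solver="liblinear")
clf.fit(X, y)
print(clf.coef_)       # L1-penalized coefficients; some may be exactly zero
```

Even with matched strengths, exact agreement also requires matching intercept handling and (un)standardized inputs on both sides.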

How to access and compare LASSO model coefficients with MLR3 (glmnet learner)?

Submitted by 与世无争的帅哥 on 2020-06-29 03:53:32
Question: Goal: Create a LASSO model using MLR3. Use nested CV, with inner CV or bootstraps for hyperparameter (lambda) determination and outer CV for model performance evaluation (instead of doing just one test-train split), and find the standard deviation of the LASSO regression coefficients across the different model instances. Do a prediction on a testing data set that is not available yet. Issues: I am unsure whether the nested CV approach as described is implemented correctly in my code below. I
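The nested-CV structure described (inner resampling tunes lambda, outer resampling scores the tuned learner) is not specific to MLR3; a minimal sklearn sketch of the same pattern, using toy data in place of the unavailable real dataset:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

# Toy data standing in for the real (unavailable) dataset.
X, y = make_regression(n_samples=100, n_features=10, noise=10.0, random_state=0)

inner = KFold(n_splits=5, shuffle=True, random_state=1)  # tunes alpha (lambda)
outer = KFold(n_splits=5, shuffle=True, random_state=2)  # estimates performance

search = GridSearchCV(Lasso(max_iter=10000),
                      {"alpha": np.logspace(-3, 1, 20)},
                      cv=inner, scoring="neg_mean_squared_error")
# Each outer fold refits the whole inner search, so tuning never sees its test fold.
scores = cross_val_score(search, X, y, cv=outer,
                         scoring="neg_mean_squared_error")
print(scores.mean())
```

In MLR3 the equivalent pieces are an AutoTuner wrapped around the glmnet learner and an outer resample() call; the key invariant in both frameworks is that lambda selection happens strictly inside each outer training split.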

Is there an R-package to calculate pseudo R-squared measures for conditional (fixed effects) logistic models using clogit or bife?

Submitted by 坚强是说给别人听的谎言 on 2020-06-25 06:33:52
Question: Is there an R package to calculate pseudo R-squared measures for my model? rcompanion supports neither clogit nor bife (due to the missing intercept?). Originally this was one question out of a larger context, which I edited to make it more readable. Thanks in advance for your help! Answer 1: Regarding question 5: definitions for pseudo r-squared values based (mostly) on log-likelihood values are given by UCLA IDRE. If you are able to extract the log likelihood from the fitted model and null model,
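Once the two log-likelihoods are extracted, the common pseudo R-squared definitions are one-liners; for example McFadden's is 1 − logLik(model)/logLik(null). A sketch with hypothetical log-likelihood values:

```python
def mcfadden_r2(ll_model, ll_null):
    """McFadden's pseudo R-squared from fitted and null log-likelihoods."""
    return 1.0 - ll_model / ll_null

# Hypothetical log-likelihoods extracted from a fitted and a null model.
print(mcfadden_r2(-120.5, -180.0))
```

A model no better than the null gives 0, and values approach 1 as the fitted log-likelihood approaches 0.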

Negative value for “mean_squared_error”

Submitted by ◇◆丶佛笑我妖孽 on 2020-01-23 12:22:08
Question: I am using scikit-learn with mean_squared_error as the scoring function for model evaluation in cross_val_score: rms_score = cross_validation.cross_val_score(model, X, y, cv=20, scoring='mean_squared_error') I am using mean_squared_error because it is a regression problem, and the estimators (model) used are lasso, ridge, and elasticNet. For all these estimators, I am getting rms_score as negative values. How is that possible, given that the differences in y values are squared? Answer 1: You get
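The reason is a sign convention: scikit-learn scorers are uniformly "higher is better", so error metrics are negated. In recent versions the scorer is even named 'neg_mean_squared_error', and you recover the ordinary MSE by flipping the sign. A sketch, assuming a current sklearn and toy data:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=100, n_features=5, noise=5.0, random_state=0)
scores = cross_val_score(Lasso(alpha=0.1), X, y, cv=5,
                         scoring="neg_mean_squared_error")
mse = -scores            # flip the sign to recover the ordinary (non-negative) MSE
print(mse.mean())
```

The negation lets generic model-selection code always maximize the score, whether it is an accuracy or an error.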

glmnet lasso ROC charts

Submitted by 会有一股神秘感。 on 2020-01-06 07:34:07
Question: I was using k-fold cross validation in glmnet (which implements lasso regression), but I can't make ROC charts from it. library(glmnet) glm_net <- cv.glmnet(dev_x_matrix,dev_y_vector,family="binomial",type.measure="class") phat <- predict(glm_net,newx=val_x_matrix,s="lambda.min") That gives me a vector of what looks like log-odds (link-scale) values rather than probabilities. I tried to generate some ROC charts after this, but it did not work. I think it is because of the nature of the x and y objects, which
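For a binomial glmnet model, predict() defaults to type="link" (log-odds); passing type="response" yields probabilities, though ROC curves are rank-based so either scale works. The same workflow in sklearn terms, as a hedged analog with toy data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, random_state=0)
X_dev, X_val, y_dev, y_val = train_test_split(X, y, random_state=1)

clf = LogisticRegression(penalty="l1", solver="liblinear").fit(X_dev, y_dev)
phat = clf.predict_proba(X_val)[:, 1]   # probabilities, not hard class labels
fpr, tpr, _ = roc_curve(y_val, phat)    # points of the ROC curve
print(roc_auc_score(y_val, phat))
```

The common failure mode is feeding hard class predictions into the ROC routine; it needs a continuous score (probability or link value) per observation.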

How to obtain coefficients from Lasso Regression in R?

Submitted by 谁说我不能喝 on 2020-01-04 14:30:11
Question: Can someone tell me how to get the coefficients for lasso regression with package lars in R? For example, if the code is: test_lasso=lars(A,B) Thank you. Answer 1: # First get the cross-validation score: test_lasso_cv=cv.lars(A,B) # Find the best fraction bestfraction = test_lasso_cv$index[which.min(test_lasso_cv$cv)] # Find the coefficients coef.lasso = predict(test_lasso,s=bestfraction,type="coefficients",mode="fraction") Source: https://stackoverflow.com/questions/25620511/how-to-obtain-coefficients-from
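The sklearn counterpart of this cv.lars-then-predict workflow is LassoCV, which runs the cross-validation internally and exposes the coefficients at the selected penalty directly. A sketch with synthetic A, B in place of the asker's data:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV

A, B = make_regression(n_samples=100, n_features=8, noise=1.0, random_state=0)

# Cross-validation picks the penalty, analogous to cv.lars() choosing the fraction.
model = LassoCV(cv=5).fit(A, B)
coef_lasso = model.coef_      # coefficients at the CV-selected alpha
print(model.alpha_, coef_lasso)
```

In both libraries the point is the same: the coefficients only become meaningful once a specific point on the regularization path has been chosen.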

LASSO with $\lambda = 0$ and OLS produce different results in R glmnet

Submitted by 送分小仙女□ on 2019-12-23 07:02:14
Question: I expect LASSO with no penalization ($\lambda=0$) to yield the same (or very similar) coefficient estimates as an OLS fit. However, I get different coefficient estimates in R when I put the same data (x, y) into glmnet(x, y, alpha=1, lambda=0) for a LASSO fit with no penalization and lm(y ~ x) for an OLS fit. Why is that? Answer 1: You're using the function incorrectly: x should be the model matrix, not the raw predictor vector. When you do that, you get exactly the same results: x <- rnorm(500) y <- rnorm(500
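The underlying expectation is sound: as the penalty goes to zero, the lasso solution converges to OLS. A sketch of that limit in sklearn terms (sklearn discourages alpha=0 for Lasso, so a tiny alpha stands in for it; the data are synthetic):

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))          # a proper design matrix, not a single raw vector
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=500)

ols = LinearRegression().fit(X, y)
# alpha=0 is discouraged for Lasso; a tiny alpha illustrates the limit instead.
lasso = Lasso(alpha=1e-6, max_iter=100000).fit(X, y)
print(ols.coef_, lasso.coef_)
```

With a properly shaped design matrix the two coefficient vectors agree to numerical precision, mirroring the glmnet-vs-lm comparison once x is passed as a model matrix.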

glmmLasso try-error for all lambda

Submitted by 别来无恙 on 2019-12-23 02:32:34
Question: I've been trying to use glmmLasso to do variable selection for a mixed model, but I can't seem to get the model to work. I've set up my model similarly to the demo found here. I'm using the simple method of choosing lambda by BIC. This is the code I've been running. library(glmmLasso) lambda <- seq(500,0,by=-5) family = binomial(link = logit) library(MASS);library(nlme) PQL<-glmmPQL(y~1,random = ~1|ID,family=family,data=train) Delta.start<-c(as.numeric(PQL$coef$fixed),rep(0,64),as.numeric
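The BIC-over-a-lambda-grid idea that the question loops over by hand has a direct fixed-effects analogue in sklearn's LassoLarsIC (glmmLasso's random effects have no sklearn equivalent, so this is only a sketch of the selection step, on toy data):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoLarsIC

X, y = make_regression(n_samples=150, n_features=20, n_informative=5,
                       noise=5.0, random_state=0)

# Walks the LARS path and picks the penalty minimizing BIC, instead of a manual grid.
model = LassoLarsIC(criterion="bic").fit(X, y)
print(model.alpha_)                   # BIC-selected penalty
print(np.count_nonzero(model.coef_))  # sparsity of the chosen model
```

When the manual grid loop in glmmLasso returns a try-error for every lambda, the usual suspects are the starting values (Delta.start) or factor coding, not the BIC criterion itself.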

Why calculating MSE in lasso regression gives different outputs?

Submitted by 寵の児 on 2019-12-22 09:09:13
Question: I am trying to run different regression models on the prostate cancer data from the lasso2 package. When I use lasso, I see two different methods to calculate the mean squared error, but they give me quite different results, so I'd like to know whether I'm doing anything wrong or whether it just means that one method is better than the other? # Needs the following R packages. library(lasso2) library(glmnet) # Gets the prostate cancer dataset data(Prostate) # Defines the Mean Square Error
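When two MSE numbers disagree, it is almost always because they are computed on different predictions (e.g. different lambda values, or train vs. test rows), not because the formula differs: applied to the same predictions, a hand-rolled MSE and a library helper agree exactly. A minimal illustration in Python with made-up values:

```python
import numpy as np
from sklearn.metrics import mean_squared_error

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

mse_manual = np.mean((y_true - y_pred) ** 2)   # mean of squared residuals
mse_sklearn = mean_squared_error(y_true, y_pred)
print(mse_manual, mse_sklearn)                 # identical values
```

So the first debugging step is to confirm both methods are scoring the same fitted values on the same observations.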