logistic-regression

Fitting Multiple Logistic Regression with Interaction between Quantitative and Qualitative Explanatory Variables with drm function from drc package

六眼飞鱼酱① submitted on 2019-12-13 02:35:59
Question: As a follow-up to this question, answered by @EDi, I wonder how to fit the following glm model with the drm function from the drc package.

GLM code:

    Type <- rep(x = LETTERS[1:3], each = 5, times = 2)
    Conc <- rep(rep(x = seq(from = 0, to = 40, by = 10), times = 3), 2)
    Rep <- factor(rep(x = 1:2, each = 15))
    Total <- 50
    Kill <- c(10, 30, 40, 45, 38, 5, 25, 35, 40, 32, 0, 32, 38, 47, 40,
              11, 33, 38, 43, 36, 4, 23, 34, 42, 34, 2, 35, 39, 46, 42)
    df <- data.frame(Type, Conc, Rep, Total, Kill)
    fm1 <- glm( formula = Kill/Total

Predict certain label with highest possible probability in logistic regression

天涯浪子 submitted on 2019-12-12 23:51:10
Question: I am building a model with 12 parameters and {0,1} labels using logistic regression in sklearn. I need to be very confident about label 0; I am OK if some 0s are misclassified as 1. The purpose of this is that I would like to exclude data from further processing if it is classified as 0. How can I tune the parameters?

Answer 1: You are basically looking for specificity, which is defined as TN/(TN+FP), where TN is the number of true negatives and FP is the number of false positives. You can read more about
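The specificity metric from the answer can be computed directly from predicted probabilities, and the usual way to trade it off is to tune the decision threshold rather than the model parameters. The sketch below is not from the original thread; the data, threshold values, and function names are made up for illustration.

```python
# Sketch: computing specificity and tuning the decision threshold so that
# label-0 predictions are made only when the model is confident.

def specificity(y_true, y_pred):
    """TN / (TN + FP): the fraction of actual negatives correctly identified."""
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tn / (tn + fp) if (tn + fp) else 0.0

def predict_with_threshold(probs_of_1, threshold):
    # Predict 0 only when P(y=1) falls below the threshold; everything else
    # goes to class 1. A lower threshold makes 0-predictions more conservative.
    return [0 if p < threshold else 1 for p in probs_of_1]

y_true = [0, 0, 0, 0, 1, 1, 1, 0]
probs  = [0.05, 0.10, 0.40, 0.20, 0.90, 0.60, 0.45, 0.55]  # e.g. clf.predict_proba(X)[:, 1]

print(specificity(y_true, predict_with_threshold(probs, 0.5)))   # 0.8
print(specificity(y_true, predict_with_threshold(probs, 0.15)))  # 0.4
```

Sweeping the threshold over a validation set and picking the value that meets the required confidence for class 0 is the standard approach; sklearn's `predict_proba` supplies the probabilities this sketch assumes.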

Spark: Extracting summary for a ML logistic regression model from a pipeline model

人走茶凉 submitted on 2019-12-12 19:15:22
Question: I've estimated a logistic regression using pipelines. These are my last few lines before fitting the logistic regression:

    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    lr = LogisticRegression(featuresCol = "lr_features", labelCol = "targetvar")
    # create assembler to include encoded features
    lr_assembler = VectorAssembler(
        inputCols = numericColumns + [categoricalCol + "ClassVec" for categoricalCol in categoricalColumns],
        outputCol = "lr_features")

Logistic Regression implementation with MNIST - not converging?

放肆的年华 submitted on 2019-12-12 17:55:44
Question: I hope someone can help me. I implemented logistic regression from scratch (so without libraries, except numpy) in Python. I used the MNIST dataset as input and, since I am doing binary classification, decided to run a test on only two digits: 1 and 2. My code can be found here: https://github.com/michelucci/Logistic-Regression-Explained/blob/master/MNIST%20with%20Logistic%20Regression%20from%20scratch.ipynb The notebook should run on any system that has the necessary library
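A frequent cause of non-convergence in from-scratch implementations on MNIST is leaving pixel values in [0, 255], which saturates the sigmoid; scaling inputs to [0, 1] and clamping the exponent usually fixes it. The following is a generic pure-Python sketch of gradient-descent logistic regression on a tiny made-up dataset, not the notebook from the question.

```python
# Minimal from-scratch logistic regression (a generic sketch).
import math

def sigmoid(z):
    # Clamp to avoid math.exp overflow on large |z|, one symptom of unscaled inputs.
    z = max(min(z, 30.0), -30.0)
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.5, epochs=2000):
    """Full-batch gradient descent on the cross-entropy loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(epochs):
        grad_w = [0.0] * len(w)
        grad_b = 0.0
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi                      # d(loss)/d(logit) for cross-entropy
            for j, xj in enumerate(xi):
                grad_w[j] += err * xj
            grad_b += err
        w = [wj - lr * gj / n for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

# Toy separable data with the feature scaled to [0, 1], as MNIST pixels should be.
X = [[0.1], [0.2], [0.3], [0.7], [0.8], [0.9]]
y = [0, 0, 0, 1, 1, 1]
w, b = train(X, y)
preds = [1 if sigmoid(w[0] * xi[0] + b) >= 0.5 else 0 for xi in X]
```

On unscaled data the same loop would produce huge logits in the first epochs and the loss would plateau or oscillate, which matches the "not converging" symptom described in the question.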

How to generate all first-order interaction terms for Lasso Logistic Regression?

风格不统一 submitted on 2019-12-12 14:26:21
Question: Is there a way in glmnet to do first-order interactions? For instance, if my X matrix were:

    V1 V2 V3
     0  1  0
     1  0  1
     1  0  0
    ...

is there a way to specify something along the lines of y ~ V1 + V2 + V3 + V1*V2 + V2*V3 + V1*V3 without manually creating the columns? My actual matrix is larger, and it would be a pain to create all first-order cross products by hand.

Answer 1: The proper R syntax for such a formula is y ~ (V1 + V2 + V3)^2. For example:

    set.seed(15)
    dd <- data.frame(V1 = runif(50), V2 = runif(50)
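Outside of R's formula interface, the same expansion can be done mechanically: for every pair of columns, append their elementwise product. The sketch below is plain Python (not glmnet), with made-up column names, to show what `y ~ (V1 + V2 + V3)^2` generates on the design-matrix side.

```python
# Expand a design matrix with all pairwise (first-order) interaction columns.
from itertools import combinations

def add_pairwise_interactions(X, names):
    """X: list of rows; names: column names. Returns expanded rows and names."""
    pairs = list(combinations(range(len(names)), 2))
    new_names = names + [f"{names[i]}:{names[j]}" for i, j in pairs]
    new_rows = [row + [row[i] * row[j] for i, j in pairs] for row in X]
    return new_rows, new_names

X = [[0, 1, 0],
     [1, 0, 1],
     [1, 0, 0]]
rows, cols = add_pairwise_interactions(X, ["V1", "V2", "V3"])
print(cols)     # ['V1', 'V2', 'V3', 'V1:V2', 'V1:V3', 'V2:V3']
print(rows[1])  # [1, 0, 1, 0, 1, 0]
```

For p original columns this adds p*(p-1)/2 interaction columns, which is why the answer's `model.matrix(y ~ (.)^2, dd)` route is preferable to building them by hand for a large matrix.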

ggplot2: How to combine histogram, rug plot, and logistic regression prediction in a single graph

╄→гoц情女王★ submitted on 2019-12-12 08:13:22
Question: I am trying to plot combined graphs for logistic regressions, as the function logi.hist.plot does, but I would like to do it using ggplot2 (for aesthetic reasons). The problem is that only one of the histograms should have scale_y_reverse(). Is there any way to specify this in a single plot (see code below), or to overlap the two histograms using coordinates that can be passed to the previous plot?

    ggplot(dat) +
      geom_point(aes(x = ind, y = dep)) +
      stat_smooth(aes(x = ind, y = dep), method = glm, method.args

Error when calculating prediction error for logistic regression model

我与影子孤独终老i submitted on 2019-12-12 05:08:48
Question: I am getting the following error: "$ operator is invalid for atomic vectors". The error occurs when I try to calculate the prediction error for a logistic regression model. Here is the code and data I am using:

    install.packages("ElemStatLearn")
    library(ElemStatLearn)

    # training data
    train = vowel.train
    # only looking at the first two classes
    train.new = train[1:3]

    # test data
    test = vowel.test
    test.new = test[1:3]

    # performing the logistic regression
    train.new$y <- as.factor(train.new$y

What does negative log likelihood of logistic regression in theano look like?

社会主义新天地 submitted on 2019-12-12 02:15:17
Question: I have been reading theano's logistic regression tutorial and was trying to understand how the negative log likelihood is calculated.

    y = ivector('y')
    W = dmatrix('W')
    b = dvector('b')
    input = dmatrix('inp')
    p_y_given_x = T.nnet.softmax(T.dot(input, W) + b)
    logs = T.log(self.p_y_given_x)[T.arange(y.shape[0]), y]

On pretty-printing with theano.printing.pprint(logs), it returned:

    'AdvancedSubtensor(log(Softmax(x)), ARange(TensorConstant{0}, Constant{0}[Shape(y)], TensorConstant{1}), y)'
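The indexing expression `T.log(p_y_given_x)[T.arange(y.shape[0]), y]` picks, for each row i, the log-probability of the true class y[i]; the negative mean of these values is the negative log likelihood. Spelled out in plain Python (a sketch with made-up scores, not the theano graph itself):

```python
# Negative log likelihood of a softmax classifier, computed explicitly.
import math

def softmax(row):
    m = max(row)                        # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def negative_log_likelihood(scores, y):
    """scores: one row of class scores per example; y: true class indices."""
    total = 0.0
    for row, target in zip(scores, y):
        p = softmax(row)
        total += math.log(p[target])    # the per-row "advanced subtensor" pick
    return -total / len(y)

scores = [[2.0, 1.0, 0.1],
          [0.5, 2.5, 0.3]]
y = [0, 1]
nll = negative_log_likelihood(scores, y)
```

The `AdvancedSubtensor(...)` string in the question is just theano's symbolic representation of this row-by-row indexing into `log(softmax(x))`.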

How to import logistic regression and k-means PMML files into R

拥有回忆 submitted on 2019-12-12 01:54:32
Question: I am looking for some guidance on importing PMML model files into R. PMML (Predictive Model Markup Language) allows models built in one system to be deployed in another. I have several models that were trained in SPSS and saved to XML format using PMML. They are logistic regression and k-means models. I have undertaken exhaustive searches for R capabilities to import PMML and am finding that there is only a rare function here and there, in packages such as arules, for

Coefficients of logistic regression have no attribute 'indices' in PySpark

╄→尐↘猪︶ㄣ submitted on 2019-12-11 18:31:12
Question: I wrote this code, and the coefficients are not available as a sparse vector, so I can't extract the indices in order to identify the active entries of the model.

    lr = LogisticRegression(elasticNetParam = 1.0, featuresCol = "features", labelCol = target_var)
    lasso_model = lr.fit(training_full)

    ## Extract variables with coefficients != 0 (sparse vector) + sorting
    coeff = lasso_model.coefficients
    coeff.indices

Source: https://stackoverflow.com/questions/55143818/coefficients-of-logistic-regression
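When the fitted model returns a dense coefficient vector, there is no `.indices` attribute to read; the generic workaround is to compute the nonzero positions yourself. Shown here in plain Python with made-up coefficient values (in PySpark the same loop would run over `lasso_model.coefficients.toArray()`):

```python
# Recover the active (nonzero) entries of a dense coefficient vector.
coefficients = [0.0, 1.7, 0.0, -0.4, 0.0, 2.1]

# (index, value) pairs for the active entries, sorted by |value| descending.
active = sorted(
    ((i, c) for i, c in enumerate(coefficients) if c != 0.0),
    key=lambda pair: abs(pair[1]),
    reverse=True,
)
indices = [i for i, _ in active]
print(indices)  # [5, 1, 3]
```

This makes the code independent of whether the model happens to return a SparseVector or a DenseVector, which can vary with the regularization settings and the data.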