confusion-matrix

Saving output of confusionMatrix as a .csv table

谁说我不能喝 · submitted 2019-12-10 10:22:02

Question: I have the following code resulting in a table-like output:

    lvs <- c("normal", "abnormal")
    truth <- factor(rep(lvs, times = c(86, 258)), levels = rev(lvs))
    pred <- factor(
      c(rep(lvs, times = c(54, 32)),
        rep(lvs, times = c(27, 231))),
      levels = rev(lvs))
    xtab <- table(pred, truth)
    library(caret)
    confusionMatrix(xtab)
    confusionMatrix(pred, truth)
    confusionMatrix(xtab, prevalence = 0.25)

I would like to export the part of the output below as a .csv table:

    Accuracy : 0.8285
    95% CI : (0.7844, 0.8668)
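A minimal Python sketch of the same idea: recompute the accuracy from the counts in the question and write the statistics to a .csv file. The counts and file name are taken from the example above; note the confidence interval here uses a normal approximation, whereas caret reports an exact binomial interval, so those two numbers will differ slightly.

```python
import csv
import math

# Counts from the caret example above: 54 + 231 correct out of 344 total.
correct, total = 54 + 231, 344
acc = correct / total

# Normal-approximation 95% CI (caret uses an exact binomial CI,
# so its bounds will differ slightly from these).
half = 1.96 * math.sqrt(acc * (1 - acc) / total)
rows = [("Accuracy", round(acc, 4)),
        ("CI_lower", round(acc - half, 4)),
        ("CI_upper", round(acc + half, 4))]

with open("confusion_stats.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

In R itself, the object returned by confusionMatrix exposes these numbers as a named vector in cm$overall, which can be passed directly to write.csv.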

How to get confusion matrix when using model.fit_generator

孤人 · submitted 2019-12-08 15:26:54

Question: I am using model.fit_generator to train and get results for my binary (two-class) model, because I feed the input images directly from a folder. How can I also get the confusion matrix (TP, TN, FP, FN) in this case? Normally I use the confusion_matrix command of sklearn.metrics, which requires predicted and actual labels, but here I have neither. Maybe I can calculate the predicted labels from predict = model.predict_generator(validation_generator). But I don't know how my…
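The usual recipe is: create the validation generator with shuffle=False, take the true labels from validation_generator.classes, threshold the probabilities returned by predict_generator, and hand both to sklearn. Since Keras cannot run here, the arrays below are hypothetical stand-ins for those two outputs; only the thresholding and confusion-matrix step is real.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical stand-ins for Keras outputs: with shuffle=False on the
# validation generator, validation_generator.classes holds the true labels
# and model.predict_generator returns one sigmoid probability per image.
y_true = np.array([0, 0, 1, 1, 1, 0])
probs = np.array([0.1, 0.7, 0.8, 0.4, 0.9, 0.2])

y_pred = (probs > 0.5).astype(int)   # threshold binary probabilities
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(tn, fp, fn, tp)                # → 2 1 1 2
```

If the generator shuffles, the order of predict_generator's output no longer matches generator.classes, which is the most common cause of a nonsense confusion matrix here.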

Bokeh heatmap from Pandas confusion matrix

生来就可爱ヽ(ⅴ<●) · submitted 2019-12-08 07:35:59

Question: How can a Pandas DataFrame be shown as a Bokeh heatmap? https://docs.bokeh.org/en/latest/docs/user_guide/categorical.html#heat-maps shows some examples, but my attempts to modify them only ever gave an empty plot. Example confusion matrix:

    df = pd.DataFrame([[10, 0, 1], [1, 10, 0], [1, 1, 9]],
                      columns=['A', 'B', 'C'], index=['A', 'B', 'C'])
    df.index.name = 'Treatment'
    df.columns.name = 'Prediction'

Answer 1: First import packages and prepare the DataFrame:

    import pandas as pd
    from bokeh.io import output_file…
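Bokeh's categorical heatmap wants long-format data (one row per cell) with string-valued categorical axes; a wide DataFrame passed in directly commonly yields exactly the empty plot described. A sketch of the reshaping step, using only pandas:

```python
import pandas as pd

df = pd.DataFrame([[10, 0, 1], [1, 10, 0], [1, 1, 9]],
                  columns=['A', 'B', 'C'], index=['A', 'B', 'C'])
df.index.name = 'Treatment'
df.columns.name = 'Prediction'

# Bokeh's rect glyph wants one row per cell (long format), and the
# categorical ranges must be strings.
long_df = df.stack().rename('value').reset_index()
long_df['Treatment'] = long_df['Treatment'].astype(str)
long_df['Prediction'] = long_df['Prediction'].astype(str)
print(long_df.head(3))
```

From here, long_df can be wrapped in a ColumnDataSource and drawn with figure(x_range=list(df.columns), y_range=list(df.index[::-1])) and p.rect(x='Prediction', y='Treatment', ...), roughly as in the linked user-guide example.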

Machine Learning Training & Test data split method

故事扮演 · submitted 2019-12-06 22:37:34

I was running a random forest classification model and initially divided the data into train (80%) and test (20%). However, the predictions had too many false positives, which I think was because there was too much noise in the training data, so I decided to split the data by a different method. Here's how I did it. Since I thought the high false positive rate was due to the noise in the training data, I made the training data contain an equal number of each target class. For example, if I have 10,000 rows of data and the target variable is split 8,000 (0) and 2,000 (1), I made the training data a total of 4…
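The balancing step described above (downsampling the majority class to match the minority class) can be sketched like this; the toy DataFrame mirrors the 8,000/2,000 split in the example:

```python
import pandas as pd

# Toy data standing in for the 10,000-row example: 8,000 zeros, 2,000 ones.
df = pd.DataFrame({'target': [0] * 8000 + [1] * 2000})

# Downsample the majority class so both classes are equally represented,
# then shuffle the combined training set.
minority = df[df['target'] == 1]
majority = df[df['target'] == 0].sample(n=len(minority), random_state=0)
balanced = pd.concat([minority, majority]).sample(frac=1, random_state=0)
print(balanced['target'].value_counts().to_dict())
```

Alternatives worth noting: stratified splitting (train_test_split(..., stratify=y) in scikit-learn) keeps the original class ratio rather than equalizing it, and many classifiers, including random forests, accept a class_weight argument that penalizes majority-class errors without discarding data.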

How to plot a confusion matrix using heatmaps in R?

会有一股神秘感。 · submitted 2019-12-06 03:46:55

Question: I have a confusion matrix such that:

      a b c d e f g h i j
    a 5 4 0 0 0 0 0 0 0 0
    b 0 0 0 0 0 0 0 0 0 0
    c 0 0 4 0 0 0 0 0 0 0
    d 0 0 0 0 0 0 0 0 0 0
    e 2 0 0 0 2 0 0 0 0 0
    f 1 0 0 0 0 2 0 0 0 0
    g 0 0 0 0 0 0 0 0 0 0
    h 0 0 0 0 0 0 0 0 0 0
    i 0 0 0 0 0 0 0 0 0 0
    j 0 0 0 0 0 0 0 0 0 0

where the letters denote the class labels. I just need to plot the confusion matrix. I searched a couple of tools; heatmaps in R look like what I need. As I don't know anything about R, it is really hard to make changes…
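In R this is typically done with heatmap() or ggplot2's geom_tile. As a sketch of the same idea in Python (the other language used on this page), here is a minimal annotated heatmap with matplotlib; the 2x2 matrix is just the top-left corner of the matrix above, used for brevity:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; drop this line for interactive use
import matplotlib.pyplot as plt
import numpy as np

def plot_confusion(cm, labels):
    """Draw a confusion matrix as an annotated heatmap."""
    fig, ax = plt.subplots()
    im = ax.imshow(cm, cmap="Blues")
    ax.set_xticks(np.arange(len(labels)))
    ax.set_xticklabels(labels)
    ax.set_yticks(np.arange(len(labels)))
    ax.set_yticklabels(labels)
    # Write each count inside its cell.
    for i in range(cm.shape[0]):
        for j in range(cm.shape[1]):
            ax.text(j, i, int(cm[i, j]), ha="center", va="center")
    fig.colorbar(im, ax=ax)
    return fig

cm = np.array([[5, 4], [2, 2]])  # top-left corner of the matrix above
fig = plot_confusion(cm, ["a", "e"])
fig.savefig("confusion.png")
```

The R equivalents map almost one-to-one: imshow corresponds to geom_tile's fill, and the text annotations to geom_text.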

how to create confusion matrix for classification in tensorflow

本秂侑毒 · submitted 2019-12-06 02:23:46

Question: I have a CNN model which has 4 output nodes, and I am trying to compute the confusion matrix so that I can know the individual class accuracy. I am able to compute the overall accuracy. In the link here, Igor Valantic gave a function which can compute the confusion matrix variables. It gives me an error at

    correct_prediction = tf.nn.in_top_k(logits, labels, 1, name="correct_answers")

and the error is: TypeError: DataType float32 for attr 'T' not in list of allowed values: int32, int64. I have…
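The error message says tf.nn.in_top_k expects integer class ids as its targets, so the usual fix is to cast the labels, not the logits (e.g. labels = tf.cast(labels, tf.int64)). The bookkeeping for the matrix itself can be sketched without TensorFlow; the logits and labels below are made-up stand-ins for the 4-class model:

```python
import numpy as np

# Hypothetical stand-ins for the model's outputs: logits over 4 classes
# and integer ground-truth labels. (In TensorFlow, the fix for the
# TypeError is labels = tf.cast(labels, tf.int64) before in_top_k.)
logits = np.array([[2.0, 0.1, 0.3, 0.1],
                   [0.2, 3.0, 0.1, 0.4],
                   [0.3, 0.2, 0.1, 2.5],
                   [1.5, 0.2, 0.9, 0.1]])
labels = np.array([0, 1, 2, 0])

preds = logits.argmax(axis=1)        # predicted class per row
num_classes = logits.shape[1]
conf = np.zeros((num_classes, num_classes), dtype=int)
for t, p in zip(labels, preds):
    conf[t, p] += 1                  # rows: truth, columns: prediction

# Per-class accuracy = diagonal over row totals (clipped to avoid 0/0).
per_class_acc = conf.diagonal() / conf.sum(axis=1).clip(min=1)
print(conf)
```

TensorFlow also ships tf.math.confusion_matrix(labels, predictions), which builds the same matrix directly once the labels are integer-typed.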

Saving output of confusionMatrix as a .csv table

纵然是瞬间 · submitted 2019-12-05 21:54:55

I have the following code resulting in a table-like output:

    lvs <- c("normal", "abnormal")
    truth <- factor(rep(lvs, times = c(86, 258)), levels = rev(lvs))
    pred <- factor(
      c(rep(lvs, times = c(54, 32)),
        rep(lvs, times = c(27, 231))),
      levels = rev(lvs))
    xtab <- table(pred, truth)
    library(caret)
    confusionMatrix(xtab)
    confusionMatrix(pred, truth)
    confusionMatrix(xtab, prevalence = 0.25)

I would like to export the part of the output below as a .csv table:

    Accuracy : 0.8285
    95% CI : (0.7844, 0.8668)
    No Information Rate : 0.75
    P-Value [Acc > NIR] : 0.0003097
    Kappa : 0.5336
    Mcnemar's Test P-Value : 0…

R Confusion Matrix sensitivity and specificity labeling

别来无恙 · submitted 2019-12-05 05:55:09

Question: I am using R v3.3.2 and caret 6.0.71 (i.e. the latest versions) to construct a logistic regression classifier. I am using the confusionMatrix function to create stats for judging its performance.

    logRegConfMat <- confusionMatrix(logRegPrediction, valData[,"Seen"])

                   Reference 0   Reference 1
    Prediction 0            30            14
    Prediction 1            60           164

    Accuracy    : 0.7239
    Sensitivity : 0.3333
    Specificity : 0.9213

The target value in my data (Seen) uses 1 for true…
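By default caret treats the first factor level as the "positive" class, which is why Sensitivity here refers to class 0 rather than class 1. Recomputing both ways from the printed counts makes this visible; a Python sketch (labels ref0/pred0 etc. are just names for the cells above):

```python
# Counts from the caret output above.
tbl = {('pred0', 'ref0'): 30, ('pred0', 'ref1'): 14,
       ('pred1', 'ref0'): 60, ('pred1', 'ref1'): 164}

def sens_spec(positive):
    """Sensitivity and specificity with the given class treated as positive."""
    negative = 'ref1' if positive == 'ref0' else 'ref0'
    pos_pred = 'pred0' if positive == 'ref0' else 'pred1'
    neg_pred = 'pred1' if positive == 'ref0' else 'pred0'
    tp = tbl[(pos_pred, positive)]
    fn = tbl[(neg_pred, positive)]
    tn = tbl[(neg_pred, negative)]
    fp = tbl[(pos_pred, negative)]
    return tp / (tp + fn), tn / (tn + fp)

print(sens_spec('ref0'))  # class 0 positive: matches caret's 0.3333 / 0.9213
print(sens_spec('ref1'))  # class 1 positive: the two values swap
```

In R, passing positive = "1" to confusionMatrix makes caret report sensitivity and specificity with class 1 as the positive class.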

True Positive Rate and False Positive Rate (TPR, FPR) for Multi-Class Data in python

南笙酒味 · submitted 2019-12-04 14:16:14

Question: How do you compute the true and false positive rates of a multi-class classification problem? Say,

    y_true = [1, -1, 0, 0, 1, -1, 1, 0, -1, 0, 1, -1, 1, 0, 0, -1, 0]
    y_prediction = [-1, -1, 1, 0, 0, 0, 0, -1, 1, -1, 1, 1, 0, 0, 1, 1, -1]

The confusion matrix is computed by metrics.confusion_matrix(y_true, y_prediction), but that just shifts the problem.

EDIT after @seralouk's answer: here, class -1 is to be considered the negative, while 0 and 1 are variations of positives.

Answer 1:…
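Given the EDIT's convention (-1 negative; 0 and 1 both positive), one direct reading is to binarize both vectors before computing the rates. A sketch using the arrays from the question:

```python
import numpy as np

y_true = np.array([1, -1, 0, 0, 1, -1, 1, 0, -1, 0, 1, -1, 1, 0, 0, -1, 0])
y_pred = np.array([-1, -1, 1, 0, 0, 0, 0, -1, 1, -1, 1, 1, 0, 0, 1, 1, -1])

# Per the EDIT: -1 is the negative class, 0 and 1 are both positives,
# so binarize before computing the rates.
t = y_true != -1
p = y_pred != -1

tp = np.sum(t & p)
fn = np.sum(t & ~p)
fp = np.sum(~t & p)
tn = np.sum(~t & ~p)

tpr = tp / (tp + fn)   # true positive rate (sensitivity)
fpr = fp / (fp + tn)   # false positive rate
print(round(tpr, 4), round(fpr, 4))  # → 0.6667 0.8
```

For genuinely per-class rates instead, treat each label one-vs-rest and repeat the same four counts per class; sklearn's confusion_matrix gives all the needed cells in one call.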

how to create confusion matrix for classification in tensorflow

穿精又带淫゛_ · submitted 2019-12-04 08:30:31

I have a CNN model which has 4 output nodes, and I am trying to compute the confusion matrix so that I can know the individual class accuracy. I am able to compute the overall accuracy. In the link here, Igor Valantic gave a function which can compute the confusion matrix variables. It gives me an error at

    correct_prediction = tf.nn.in_top_k(logits, labels, 1, name="correct_answers")

and the error is: TypeError: DataType float32 for attr 'T' not in list of allowed values: int32, int64. I have tried typecasting logits to int32 inside the function mentioned, def evaluation(logits, labels); it gives…