Why is my confusion matrix returning only one number?

Submitted by 天大地大妈咪最大 on 2020-12-13 06:30:31

Question


I'm doing binary classification. Whenever my predictions exactly match the ground truth, sklearn.metrics.confusion_matrix returns a single value. Isn't that a problem?

from sklearn.metrics import confusion_matrix
print(confusion_matrix([True, True], [True, True]))
# [[2]]

I would expect something like:

[[2 0]
 [0 0]]

Answer 1:


Solution: to get the desired output, pass labels=[True, False] explicitly:

from sklearn.metrics import confusion_matrix

cm = confusion_matrix(y_true=[True, True], y_pred=[True, True], labels=[True, False])
print(cm)

# [[2 0]
#  [0 0]]

Explanation: From the docs, the output of confusion_matrix(y_true, y_pred) is:

C: ndarray of shape (n_classes, n_classes)

The variable n_classes is either:

  • inferred as the number of unique values appearing in y_true or y_pred, or
  • taken from the length of the optional labels parameter.

In your case, because you did not pass labels, n_classes is inferred from the number of unique values in [True, True], which is 1. Hence the 1x1 result.
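
For comparison, a minimal sketch (not part of the original answer) contrasting the two behaviours on the same data:

from sklearn.metrics import confusion_matrix

y = [True, True]

# labels inferred from the data: one unique value -> 1x1 matrix
print(confusion_matrix(y, y))
# [[2]]

# labels passed explicitly: always a 2x2 matrix, even for a class absent from the data
print(confusion_matrix(y, y, labels=[True, False]))
# [[2 0]
#  [0 0]]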




Answer 2:


You are getting a single number because you didn't specify the labels. The code below should also help you get true positives, true negatives, precision, recall, F1-score, etc.

Code

from sklearn import metrics
import pandas as pd

A     = [True, True]
B     = [True, True]
label = [True, False]

# Confusion matrix with an explicit label order
cm = metrics.confusion_matrix(y_true=A, y_pred=B, labels=label)
print(cm)

# Report (F1-score, precision, recall, support)
report = metrics.classification_report([True, False], [False, True], digits=3, output_dict=True)
df = pd.DataFrame(report).transpose()
print(df)

Results

[[2 0]
 [0 0]]
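
As a further note (not part of the original answers), scikit-learn's documented convention is that with labels=[False, True] the flattened 2x2 matrix unpacks in the order tn, fp, fn, tp; a minimal sketch:

from sklearn.metrics import confusion_matrix

# labels=[False, True] puts the negative class first, so .ravel()
# yields the counts in the order tn, fp, fn, tp
tn, fp, fn, tp = confusion_matrix(
    y_true=[True, True],
    y_pred=[True, True],
    labels=[False, True],
).ravel()
print(tn, fp, fn, tp)  # 0 0 0 2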



Source: https://stackoverflow.com/questions/65248401/why-is-my-confusion-matrix-returning-only-one-number
