What is the difference between cross-entropy and log loss error?

隐瞒了意图╮ 2020-12-14 09:50

What is the difference between cross-entropy and log loss error? The formulae for both seem to be very similar.

1 Answer
  • 2020-12-14 10:47

    They are essentially the same. Conventionally, the term log loss is used for binary classification problems and the more general cross-entropy (loss) for multi-class classification, but even this distinction is not applied consistently, and you'll often find the two terms used interchangeably as synonyms.
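
    To see the similarity concretely (a side-by-side sketch of the two formulas): for a single sample with true label $y \in \{0, 1\}$ and predicted probability $p$ of the positive class, the binary log loss is

        $-\left[\, y \log p + (1 - y) \log(1 - p) \,\right]$

    while the general cross-entropy over $K$ classes, with one-hot targets $y_k$ and predicted probabilities $p_k$, is

        $-\sum_{k=1}^{K} y_k \log p_k$

    Setting $K = 2$ with $p_1 = p$ and $p_2 = 1 - p$ recovers the binary formula exactly; log loss is just cross-entropy restricted to two classes.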

    From the Wikipedia entry for cross-entropy:

    The logistic loss is sometimes called cross-entropy loss. It is also known as log loss.

    From the fast.ai wiki entry on log loss:

    Log loss and cross-entropy are slightly different depending on the context, but in machine learning when calculating error rates between 0 and 1 they resolve to the same thing.

    From the ML Cheatsheet:

    Cross-entropy loss, or log loss, measures the performance of a classification model whose output is a probability value between 0 and 1.
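
    As a quick numerical check, here is a minimal Python sketch (the helper cross_entropy is mine; log_loss comes from scikit-learn) showing that the binary log loss and the two-class cross-entropy produce the same number:

        import numpy as np
        from sklearn.metrics import log_loss

        def cross_entropy(y_true_onehot, y_pred):
            """General multi-class cross-entropy: mean over samples of -sum_k y_k * log(p_k)."""
            eps = 1e-15  # clip to avoid log(0)
            y_pred = np.clip(y_pred, eps, 1 - eps)
            return -np.mean(np.sum(y_true_onehot * np.log(y_pred), axis=1))

        # Binary problem: labels and predicted probabilities of the positive class
        y_true = np.array([0, 1, 1, 0])
        p = np.array([0.1, 0.8, 0.6, 0.3])

        # Recast as a 2-class problem: one-hot targets, two-column probabilities
        y_onehot = np.stack([1 - y_true, y_true], axis=1)
        p_2col = np.stack([1 - p, p], axis=1)

        print(log_loss(y_true, p))               # binary log loss        -> ~0.2990
        print(cross_entropy(y_onehot, p_2col))   # two-class cross-entropy -> same value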
