Question
When considering the problem of classifying an input to one of 2 classes, 99% of the examples I saw used a NN with a single output and sigmoid as their activation followed by a binary cross-entropy loss. Another option that I thought of is having the last layer produce 2 outputs and use a categorical cross-entropy with C=2 classes, but I never saw it in any example. Is there any reason for that?
Thanks
Answer 1:
If you apply softmax on top of the two-output network, you get an output that is mathematically equivalent to using a single output with a sigmoid on top.
Do the math and you'll see.
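To see the equivalence numerically, here is a minimal sketch (the logit values `a` and `b` are arbitrary examples): the softmax probability of class 0 over logits `[a, b]` is `e^a / (e^a + e^b) = 1 / (1 + e^(b-a))`, which is exactly `sigmoid(a - b)`.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(logits):
    # Subtract the max for numerical stability; doesn't change the result.
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

# Two arbitrary logits from a hypothetical two-output network.
a, b = 1.3, -0.7

# Softmax probability of class 0 ...
p_softmax = softmax(np.array([a, b]))[0]
# ... equals the sigmoid of the logit difference.
p_sigmoid = sigmoid(a - b)

print(p_softmax, p_sigmoid)
assert np.isclose(p_softmax, p_sigmoid)
```

So the two-output softmax only ever depends on the difference of the two logits, which is what the single sigmoid output learns directly.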
In practice, from my experience, if you look at the raw "logits" of the two-output net (before the softmax) you'll see that one is exactly the negative of the other. This is a result of the gradients pulling the two neurons in exactly opposite directions.
Therefore, since both approaches are equivalent, the single-output configuration has fewer parameters and requires less computation, so it is more advantageous to use a single output with a sigmoid on top.
Source: https://stackoverflow.com/questions/57726064/binary-cross-entropy-vs-categorical-cross-entropy-with-2-classes