Question
What is the equivalent in TensorFlow of the following Theano function?
theano.tensor.nnet.categorical_crossentropy(o, y)
Answer 1:
For 2D tensors whose rows are probability distributions along the 2nd dimension:

import tensorflow as tf

def crossentropy(p_approx, p_true):
    # negative sum of p_true * log(p_approx) over the class axis (axis 1)
    return -tf.reduce_sum(tf.multiply(p_true, tf.log(p_approx)), 1)
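A minimal usage sketch (the sample values are made up for illustration and assume TensorFlow 1.x, where tf.log is available):

p_true = tf.constant([[0.0, 1.0], [1.0, 0.0]])    # one-hot targets
p_approx = tf.constant([[0.3, 0.7], [0.8, 0.2]])  # predicted distributions (rows sum to 1)
loss = crossentropy(p_approx, p_true)             # shape (2,): one cross-entropy value per row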
Answer 2:
I think you would want to use TensorFlow's softmax cross-entropy loss (tf.nn.softmax_cross_entropy_with_logits). Remember that the input to this op is unscaled logits, i.e. you cannot feed it softmax output; that will give wrong results.
Another important reason to use this fused loss, instead of applying softmax followed by categorical cross-entropy yourself, is that it is more numerically stable. See this loss in Caffe; also, for some discussion about stability, see this.
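A minimal sketch of how that call looks (assuming TensorFlow 1.x; the logits/labels values are made up for illustration):

import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1]])   # unscaled scores, NOT softmax output
labels = tf.constant([[1.0, 0.0, 0.0]])   # one-hot targets
# fused softmax + cross-entropy, computed in a numerically stable way
loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)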
Source: https://stackoverflow.com/questions/43825368/equivalence-of-categorical-crossentropy-function-of-theano-in-tensorflow