CNTK: Define a custom loss function (Sørensen-Dice coefficient)

Submitted on 2019-12-06 13:16:49

Question


I'd like to use the Sørensen–Dice coefficient (Wikipedia: Sørensen–Dice coefficient) as a loss function in CNTK/Python. How can I define a custom loss function?


Answer 1:


To answer your more general question, "How can I define a custom loss function?":

In CNTK, loss functions are not special. Any expression that results in a scalar can be used as a loss function. The learner will compute the minibatch-level loss by summing up the scalar loss values of all samples in the minibatch, and backpropagate through it like through any CNTK expression.

For example, the following is a way of defining a square-error loss:

import cntk as C

def my_square_error(x, y):
    diff = x - y
    # inner product of the difference with itself = sum of squared differences
    return C.times_transpose(diff, diff)
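
As a quick check (my own illustration, not part of the original answer), the expression can be evaluated directly on numpy inputs:

import numpy as np

xv = C.input_variable(3)
yv = C.input_variable(3)
my_square_error(xv, yv).eval({xv: np.array([[1., 2., 3.]], dtype=np.float32),
                              yv: np.array([[1., 1., 1.]], dtype=np.float32)})
# ~5.0  (0^2 + 1^2 + 2^2)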

and the cross_entropy_with_softmax() loss can be written in Python like this:

def my_cross_entropy_with_softmax(output, labels):
    # reduce_log_sum was renamed reduce_log_sum_exp in later CNTK releases
    logZ = C.reduce_log_sum_exp(output)  # log of the softmax denominator
    # negative log-likelihood of the target labels, which the learner minimizes
    return logZ - C.times_transpose(labels, output)
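
As a sanity check (my own addition, not from the original answer), the hand-written expression can be compared against the built-in loss on concrete inputs:

import numpy as np
import cntk as C

out = C.input_variable(3)
lab = C.input_variable(3)

builtin = C.cross_entropy_with_softmax(out, lab)
custom = my_cross_entropy_with_softmax(out, lab)

o = np.array([[1.0, 2.0, 3.0]], dtype=np.float32)
l = np.array([[0.0, 0.0, 1.0]], dtype=np.float32)

print(builtin.eval({out: o, lab: l}))  # ~0.4076
print(custom.eval({out: o, lab: l}))   # should agree with the built-in value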

Lastly, multi-task learning can be trivially realized by using a loss function that is a weighted sum over multiple losses.
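
For example, a minimal sketch of such a weighted sum (the task names, the weight alpha, and the choice of built-in losses are my own illustration):

import cntk as C

def multi_task_loss(class_output, class_labels, reg_output, reg_targets, alpha=0.5):
    # weighted sum of a classification loss and a regression loss;
    # the combined expression is still a scalar, so it can serve as the training loss
    classification_loss = C.cross_entropy_with_softmax(class_output, class_labels)
    regression_loss = C.squared_error(reg_output, reg_targets)
    return classification_loss + alpha * regression_loss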




Answer 2:


import numpy as np
import cntk as C

def dice_coefficient(x, y):
    # https://en.wikipedia.org/wiki/S%C3%B8rensen%E2%80%93Dice_coefficient
    intersection = C.reduce_sum(C.element_times(x, y))

    return 2 * intersection / (C.reduce_sum(x) + C.reduce_sum(y))

shape = (1, 2, 2)

x1 = np.ones(shape)
y1 = np.reshape([0, 1, 0, 1], shape)

x = C.sanitize_input(x1)
y = C.sanitize_input(y1)

dice_coefficient(x, y).eval({x: x1, y: y1})

array([ 0.66666669], dtype=float32)
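
Since the Dice coefficient measures overlap (higher is better) while CNTK learners minimize the loss, a common choice (my own note, not part of the original answer) is to train on its complement:

def dice_loss(x, y):
    # 1 - Dice coefficient, so that minimizing the loss maximizes the overlap
    return 1 - dice_coefficient(x, y)

dice_loss(x, y).eval({x: x1, y: y1})   # ~0.3333 for the example above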



Source: https://stackoverflow.com/questions/43132048/cntk-define-a-custom-loss-function-s%c3%b8rensen-dice-coefficient
