Tensorflow 2.0 Custom loss function with multiple inputs

谎友^ 2021-02-15 18:11

I am trying to optimize a model with the following two loss functions:

def loss_1(pred, weights, logits):
    weighted_sparse_ce = kls.SparseCategoricalCrossentropy(from_logits=True)  # kls = tf.keras.losses
    # (the remainder of the snippet is cut off in the original post)
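
The snippet above is cut off in the post. For context, a common way to drive losses that take extra inputs (per-sample weights, logits) in TF 2.x is a manual training step with tf.GradientTape. The sketch below is only illustrative; it assumes `model` is an already-built Keras model and that `loss_1` returns a scalar:

import tensorflow as tf

optimizer = tf.keras.optimizers.Adam()

@tf.function
def train_step(x, y_true, weights):
    with tf.GradientTape() as tape:
        logits = model(x, training=True)          # forward pass; model assumed defined elsewhere
        loss = loss_1(y_true, weights, logits)    # custom loss with extra inputs passed directly
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss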


        
3 Answers
  •  清歌不尽
    2021-02-15 18:30

    In tf 1.x we have the tf.nn.weighted_cross_entropy_with_logits function, which allows us to trade off recall and precision by adding an extra positive weight for each class. In multi-label classification, it should be an (N,) tensor or numpy array. However, in tf 2.0 I haven't found a similar loss function yet, so I wrote my own loss function with an extra argument, pos_w_arr.
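
    For reference, the tf 1.x call being described looks roughly like this (a sketch; pos_weight is the per-class positive weight, broadcast against logits of shape (batch, N)):

    import tensorflow as tf

    # labels, logits: shape (batch, N); pos_weight: shape (N,)
    # (the first argument is named "targets" in older tf 1.x releases)
    per_element = tf.nn.weighted_cross_entropy_with_logits(labels, logits, pos_weight)
    loss = tf.reduce_mean(per_element)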

    import tensorflow as tf
    from tensorflow.keras.backend import epsilon

    def pos_w_loss(pos_w_arr):
        """
        Positive-weighted loss: pos_w_arr holds one positive weight per class.
        """
        def fn(y_true, y_pred):
            # Clip predictions away from 0 and 1 to keep log() finite
            _epsilon = tf.convert_to_tensor(epsilon(), dtype=y_pred.dtype.base_dtype)
            _y_pred = tf.clip_by_value(y_pred, _epsilon, 1. - _epsilon)
            # Binary cross-entropy with the positive term scaled per class by pos_w_arr
            cost = tf.multiply(tf.multiply(y_true, tf.math.log(_y_pred)), pos_w_arr) \
                + tf.multiply(1 - y_true, tf.math.log(1 - _y_pred))
            return -tf.reduce_mean(cost)
        return fn
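
    It is used like any other Keras loss, for example (the pos_w_arr values here are hypothetical per-class weights for a sigmoid-output model):

    pos_w_arr = tf.constant([2.0, 1.0, 5.0])  # hypothetical per-class positive weights
    model.compile(optimizer='adam', loss=pos_w_loss(pos_w_arr), metrics=['accuracy'])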
    

    Not sure what you mean by it not working when using eager tensors or numpy arrays as inputs, though. Please correct me if I'm wrong.
