keras combining two losses with adjustable weights

抹茶落季 · 2020-12-30 12:56

Here is a detailed description. I have a Keras functional model with two layers whose outputs are x1 and x2.

x1 = Dense(1, activation='relu')(prev_in)
1 Answer
  • 2020-12-30 13:22

    Propagating the same combined loss into both branches will have no effect unless alpha depends on both branches. If alpha does not depend on both branches, then from each branch's point of view the other branch's term in the loss is just a constant, so it contributes nothing to that branch's gradients.

    So, in that case, just compile the model with the two losses kept separate and pass the weights to the compile method:

    model.compile(optimizer='someOptimizer', loss=[loss1, loss2], loss_weights=[alpha, 1 - alpha])
    

    Compile again when you need alpha to change.
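
    As a minimal runnable sketch of this fixed-weight approach (the input shape, layer sizes, 'mse' losses, optimizer, and dummy data below are all illustrative assumptions, not from the question):

    import numpy as np
    from tensorflow.keras.layers import Input, Dense
    from tensorflow.keras.models import Model

    alpha = 0.7  # assumed fixed weight for the first loss

    inp = Input(shape=(10,))
    prev_in = Dense(32, activation='relu')(inp)
    x1 = Dense(1, activation='relu')(prev_in)
    x2 = Dense(1, activation='relu')(prev_in)

    model = Model(inputs=inp, outputs=[x1, x2])

    # One loss per output; Keras minimizes alpha*loss1 + (1-alpha)*loss2.
    model.compile(optimizer='adam',
                  loss=['mse', 'mse'],
                  loss_weights=[alpha, 1 - alpha])

    # Dummy data: one target array per output.
    X = np.random.rand(64, 10)
    y1 = np.random.rand(64, 1)
    y2 = np.random.rand(64, 1)
    model.fit(X, [y1, y2], epochs=1)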


    But if alpha does indeed depend on both branches, then you need to concatenate the results into a single output and calculate alpha inside the loss function:

    singleOut = Concatenate()([x1, x2])
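
    The model is then built around this single concatenated output. A short sketch, reusing the assumed inp tensor from the example above; note that Concatenate comes from the layers module:

    from tensorflow.keras.layers import Concatenate

    singleOut = Concatenate()([x1, x2])
    model = Model(inputs=inp, outputs=singleOut)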
    

    And a custom loss function:

    def weightedLoss(yTrue, yPred):
        # Slice along axis 1 (the feature axis), keeping the batch axis:
        # column 0 belongs to x1, the remaining columns to x2.
        x1True = yTrue[:, :1]
        x2True = yTrue[:, 1:]

        x1Pred = yPred[:, :1]
        x2Pred = yPred[:, 1:]

        # calculate alpha somehow with keras backend functions

        return alpha * someLoss(x1True, x1Pred) + (1 - alpha) * someLoss(x2True, x2Pred)
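
    Purely for illustration, here is one arbitrary way the alpha placeholder could be filled in; this particular choice (weighting each branch by its share of the total absolute error) and the mean-squared losses are assumptions, not something from the question:

    from tensorflow.keras import backend as K

    def weightedLoss(yTrue, yPred):
        x1True, x2True = yTrue[:, :1], yTrue[:, 1:]
        x1Pred, x2Pred = yPred[:, :1], yPred[:, 1:]

        # Illustrative alpha: branch 1's share of the total absolute error,
        # so alpha genuinely depends on both branches.
        err1 = K.mean(K.abs(x1True - x1Pred))
        err2 = K.mean(K.abs(x2True - x2Pred))
        alpha = err1 / (err1 + err2 + K.epsilon())

        return (alpha * K.mean(K.square(x1True - x1Pred))
                + (1 - alpha) * K.mean(K.square(x2True - x2Pred)))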
    

    Compile with this function:

    model.compile(loss=weightedLoss, optimizer=...)
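
    With a single concatenated output, the training targets must be concatenated in the same column order as the outputs. Continuing the assumed sketch from above:

    model.compile(loss=weightedLoss, optimizer='adam')

    # Column order must match the Concatenate order: [x1, x2].
    yTrain = np.concatenate([y1, y2], axis=1)
    model.fit(X, yTrain, epochs=1)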
    