Question
I want to implement the following loss function in Keras:

    Loss = mse + double_derivative(y_pred, x_train)

I am not able to incorporate the derivative term. I have tried K.gradients(K.gradients(y_pred, x_train), x_train), but it does not help.
I am getting the error message:

    AttributeError: 'NoneType' object has no attribute 'op'
def _loss_tensor(y_true, y_pred, x_train):
    l1 = K.mean(K.square(y_true - y_pred), axis=-1)
    sigma = 0.01
    lamda = 3
    term = K.square(sigma) * K.gradients(K.gradients(y_pred, x_train), x_train)
    l2 = K.mean(lamda * K.square(term), axis=-1)
    return l1 + l2

def loss_func(x_train):
    def loss(y_true, y_pred):
        return _loss_tensor(y_true, y_pred, x_train)
    return loss

def create_model_neural(learning_rate, num_layers,
                        num_nodes, activation):
    model_neural = Sequential()
    x_train = model_neural.add(Dense(num_nodes, input_dim=num_input, activation=activation))
    for i in range(num_layers-1):
        model_neural.add(Dense(num_nodes, activation=activation, name=name))
    model_neural.add(Dense(1, activation=activation))
    optimizer = SGD(lr=learning_rate)
    model_loss = loss_func(x_train=x_train)
    model_neural.compile(loss=model_loss, optimizer=optimizer)
    return model_neural
Answer 1:
The problem is that x_train is always None, and Keras can't take a derivative with respect to None. This is happening because model_neural.add(...) does not return anything.

I assume that x_train is the input that is passed to the network. In that case, x_train should probably be another argument of create_model_neural, or alternatively you can try the model_neural.input tensor.
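
For reference, here is a minimal sketch of the second suggestion: the loss closure is handed model_neural.input, which is a real symbolic tensor, instead of the None returned by model_neural.add(...). The network shape (num_input = 1) and the data in the usage example are assumptions for illustration only; note also that K.gradients returns a list of tensors, so the sketch takes its first element before further use.

    import numpy as np
    from keras import backend as K
    from keras.models import Sequential
    from keras.layers import Dense
    from keras.optimizers import SGD

    num_input = 1  # assumed input dimension, for illustration only

    def _loss_tensor(y_true, y_pred, x_input):
        # MSE term
        l1 = K.mean(K.square(y_true - y_pred), axis=-1)
        sigma = 0.01
        lamda = 3
        # K.gradients returns a list; take the first element each time
        first = K.gradients(y_pred, x_input)[0]
        second = K.gradients(first, x_input)[0]
        term = K.square(sigma) * second
        l2 = K.mean(lamda * K.square(term), axis=-1)
        return l1 + l2

    def loss_func(x_input):
        def loss(y_true, y_pred):
            return _loss_tensor(y_true, y_pred, x_input)
        return loss

    def create_model_neural(learning_rate, num_layers,
                            num_nodes, activation):
        model_neural = Sequential()
        model_neural.add(Dense(num_nodes, input_dim=num_input, activation=activation))
        for i in range(num_layers - 1):
            model_neural.add(Dense(num_nodes, activation=activation))
        model_neural.add(Dense(1, activation=activation))
        optimizer = SGD(lr=learning_rate)
        # model_neural.input is the symbolic input tensor of the network,
        # so the nested K.gradients calls have a real tensor to
        # differentiate against and the 'NoneType' error goes away
        model_loss = loss_func(x_input=model_neural.input)
        model_neural.compile(loss=model_loss, optimizer=optimizer)
        return model_neural

Usage with some made-up 1-D data:

    model = create_model_neural(learning_rate=0.01, num_layers=3,
                                num_nodes=16, activation='tanh')
    x = np.random.rand(100, num_input)
    y = np.sin(x)
    model.fit(x, y, epochs=5)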
Source: https://stackoverflow.com/questions/50288258/derivative-in-loss-function-in-keras