Question
I have been attempting to use tf.train.sgd(learningRate).minimize(loss)
in my code in order to perform back-propagation, but I keep getting errors such as The f passed in variableGrads(f) must be a function. How would I implement the function above into the code below successfully, and why does this error occur in the first place?
Neural Network:
var X = tf.tensor([[1,2,3], [4,5,6], [7,8,9], [10,11,12]])
var Y = tf.tensor([[0,0,0],[0,0,0], [1,1,1]])
var m = X.shape[0]
var a0 = tf.zeros([1,3])
var y_hat = tf.zeros([1,3])
var parameters = {
    "Wax": tf.randomUniform([1,3]),
    "Waa": tf.randomUniform([3,3]),
    "ba": tf.zeros([1,3]),
    "Wya": tf.randomUniform([3,3]),
    "by": tf.zeros([1,3])
}
function RNN_cell_Foward(xt, a_prev, parameters){
    var Wax = parameters["Wax"]
    var Waa = parameters["Waa"]
    var ba = parameters["ba"]
    var a_next = tf.sigmoid(tf.add(tf.add(tf.matMul(xt, Wax), tf.matMul(a_prev, Waa)), ba))
    return a_next
}
function RNN_FowardProp(X, a0, parameters){
    var T_x = X.shape[0]
    var a_next = a0
    var i = 1
    var Wya = parameters["Wya"]
    var by = parameters["by"]
    var l = 1
    for(; i <= T_x; i++){
        var X_i = X.slice([i-1,0],[1,-1])
        for(; l <= X.shape[1]; l++){
            var xt = X_i.slice([0,l-1],[1,1])
            var a_next = RNN_cell_Foward(xt, a_next, parameters)
        }
        var y_pred = tf.sigmoid(tf.add(tf.matMul(a_next, Wya), by))
        l = 1
        if (i == 1){
            var y_pred1 = y_pred
        } else if (i == 2) {
            var y_pred2 = y_pred
        } else if (i == 3) {
            var y_pred3 = y_pred
        }
    }
    var y_predx = tf.concat([y_pred1, y_pred2, y_pred3])
    return y_predx
}
const learningRate = 0.01;
var optimizer = tf.train.sgd(learningRate);
var model = RNN_FowardProp(X, a0, parameters)
var loss = tf.losses.meanSquaredError(Y, model)
for (let f = 0; f < 10; f++) {
    optimizer.minimize(loss)
}
This is a neural network for sentiment classification with a many-to-one structure.
Answer 1:
The error says it all:
The f passed in variableGrads(f) must be a function
optimizer.minimize expects a function as its parameter, not a tensor. Since the code is trying to minimize the meanSquaredError, the argument of minimize can be a function that computes the meanSquaredError between the predicted value and the expected one.
// custom MSE; loss(model, Y) could be used in place of tf.losses.meanSquaredError below
const loss = (pred, label) => pred.sub(label).square().mean();
for (let f = 0; f < 10; f++) {
    optimizer.minimize(() => tf.losses.meanSquaredError(Y, model))
}
Does this solve the issue? Not completely: the error changes to something like
variableGrads() expects at least one of the input variables to be trainable
What does it mean? The optimizer expects the function passed as its argument to contain variables whose values will be updated to minimize the function's output.
Here are the changes to be made:
var Y = tf.tensor([[0,0,0],[0,0,0], [1,1,1]]).variable() // a variable instead
// var loss = tf.losses.meanSquaredError(Y, model)
// computed below in the minimize function
const learningRate = 0.01;
var optimizer = tf.train.sgd(learningRate);
var model = RNN_FowardProp(X, a0, parameters);
const loss = (pred, label) => pred.sub(label).square().mean();
for (let f = 0; f < 10; f++) {
    optimizer.minimize(() => tf.losses.meanSquaredError(Y, model))
}
Source: https://stackoverflow.com/questions/63407284/backpropagation-in-an-tensorflow-js-neural-network