Question
I'm currently trying to implement a neural network with two training phases: first I want to minimize loss_first_part, and then loss_second_part.
    tf.global_variables_initializer().run()
    for epoch in range(nb_epochs):
        if epoch < 10:
            train_step = optimizer.minimize(loss_first_part)
        else:
            train_step = optimizer.minimize(loss_second_part)
The problem is that the initializer has to be defined after the optimizer.minimize call. Indeed, I get the following error: Attempting to use uninitialized value beta1_power.
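As an illustration of why this happens: minimize() itself adds new variables to the graph (for Adam, beta1_power, beta2_power, and the per-variable slots), so an init op built beforehand never covers them. A minimal sketch, with a made-up variable and loss:

    import tensorflow as tf  # TF 1.x graph-mode API

    w = tf.Variable(0.0)
    loss = tf.square(w - 1.0)  # made-up loss, just for illustration
    optimizer = tf.train.AdamOptimizer(learning_rate=0.01)

    print([v.name for v in tf.global_variables()])
    # only w so far

    train_step = optimizer.minimize(loss)

    print([v.name for v in tf.global_variables()])
    # now also beta1_power, beta2_power and the Adam slot variables;
    # an init op created before minimize() would not initialize these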
How can I fix this, considering that I want my optimizer to depend on the epoch?
Thanks a lot for your help!
Answer 1:
I've found it! So simple: build both train ops before running the initializer, then pick which one to execute inside the training loop.
    train_step1 = optimizer.minimize(loss_first_part)
    train_step2 = optimizer.minimize(loss_second_part)
    tf.global_variables_initializer().run()
    # ... then, inside the training loop:
    if epoch < 10:  # e.g. the epoch-based condition from the question
        sess.run(train_step1)
    else:
        sess.run(train_step2)
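For completeness, here is a minimal self-contained sketch of the whole pattern in TF 1.x; the model, losses, and data are made up for illustration, and only the ordering (build both train ops, then initialize, then choose at run time) is the point:

    import numpy as np
    import tensorflow as tf  # TF 1.x graph-mode API

    # Toy setup: one weight and two hypothetical losses.
    x = tf.placeholder(tf.float32, shape=[None])
    y = tf.placeholder(tf.float32, shape=[None])
    w = tf.Variable(0.0)
    pred = w * x

    loss_first_part = tf.reduce_mean(tf.square(pred - y))   # e.g. MSE
    loss_second_part = tf.reduce_mean(tf.abs(pred - y))     # e.g. MAE

    optimizer = tf.train.AdamOptimizer(learning_rate=0.01)

    # Build BOTH train ops before the initializer, so the variables that
    # minimize() creates (beta1_power, beta2_power, Adam slots) get initialized.
    train_step1 = optimizer.minimize(loss_first_part)
    train_step2 = optimizer.minimize(loss_second_part)

    nb_epochs = 20
    data_x = np.random.randn(100).astype(np.float32)
    data_y = 3.0 * data_x

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch in range(nb_epochs):
            # The graph never changes; we only choose which op to run.
            train_step = train_step1 if epoch < 10 else train_step2
            sess.run(train_step, feed_dict={x: data_x, y: data_y})
        print("final w:", sess.run(w))

Because both ops exist in the graph before the initializer runs, every variable Adam creates is initialized, and switching losses is just a matter of which op you pass to sess.run.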
Source: https://stackoverflow.com/questions/51284144/use-different-optimizers-depending-on-a-if-statement-in-tensorflow