How do I switch tf.train.Optimizers during training?


Just define two optimizers and switch between them:

import tensorflow as tf

# Build one train op per optimizer; both minimize the same cost tensor
sgd_optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
adap_optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)
...
for epoch in range(100):
  for (x, y) in zip(train_X, train_Y):
    # Train with Adam for the first 50 epochs, then switch to plain SGD
    optimizer = sgd_optimizer if epoch > 50 else adap_optimizer
    sess.run(optimizer, feed_dict={X: x, Y: y})

An optimizer only encapsulates how gradients are applied to the variables; at most it holds a few variables of its own (for example, Adam's moment accumulators). The model weights are not stored in the optimizer, so you can switch optimizers freely.
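A minimal sketch of why this works (assuming TensorFlow 1.x and a hypothetical one-variable model): Adam's extra state lives in slot variables of its own, while the weight being trained is shared by both train ops.

import tensorflow as tf  # assumes TensorFlow 1.x

w = tf.Variable(0.0, name="w")   # the model's only weight
cost = tf.square(w - 3.0)        # toy objective, minimized at w = 3

sgd_step = tf.train.GradientDescentOptimizer(0.1).minimize(cost)
adam_step = tf.train.AdamOptimizer(0.1).minimize(cost)

# Adam adds its own variables (moment accumulators and beta powers);
# plain SGD adds none. The weight `w` itself appears only once.
print([v.name for v in tf.global_variables()])
# -> something like ['w:0', 'w/Adam:0', 'w/Adam_1:0', 'beta1_power:0', 'beta2_power:0']

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(100):
        sess.run(adam_step if step < 50 else sgd_step)
    print(sess.run(w))           # close to 3.0

One caveat: create both minimize() ops before running tf.global_variables_initializer(), otherwise Adam's slot variables are left uninitialized and the first sess.run of the Adam train op raises a FailedPreconditionError.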
