How to add learning rate to summaries?

依然范特西╮ submitted on 2020-01-15 05:17:22

Question


How do I monitor the learning rate of AdamOptimizer? TensorBoard: Visualizing Learning says that I need to:

Collect these by attaching scalar_summary ops to the nodes that output the learning rate and loss respectively.

How can I do this?


Answer 1:


I think something like the following inside the graph would work fine:

import tensorflow as tf

with tf.name_scope("learning_rate"):
    global_step = tf.Variable(0, trainable=False)  # incremented once per training step
    decay_steps = 1000  # set up your decay steps
    decay_rate = .95    # set up your decay rate
    learning_rate = tf.train.exponential_decay(
        0.01, global_step, decay_steps, decay_rate,
        staircase=True, name="learning_rate")
tf.scalar_summary('learning_rate', learning_rate)

(Of course, to make this work you also need to call tf.merge_all_summaries() and use a tf.train.SummaryWriter to write the summaries to the log in the end.)
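For completeness, here is a minimal sketch of that plumbing, using the same pre-1.0 TensorFlow summary API as the snippet above; the "logs/" directory and the bare training loop are illustrative placeholders, not part of the original answer:

# Minimal summary-writing sketch (TF 0.x API); "logs/" and the loop are placeholders.
merged = tf.merge_all_summaries()          # gathers every summary op in the graph
writer = tf.train.SummaryWriter("logs/")   # writes the event files TensorBoard reads

with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    for step in range(1000):
        # ... run your training op here (it should increment global_step) ...
        summary_str = sess.run(merged)
        writer.add_summary(summary_str, step)  # tag the summary with the step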



Source: https://stackoverflow.com/questions/40752053/how-to-add-learning-rate-to-summaries
