How can I use tensorboard with tf.estimator.Estimator

4 answers · 971 views
粉色の甜心 · asked 2020-12-30 06:44

I am considering moving my code base to tf.estimator.Estimator, but I cannot find an example of how to use it in combination with TensorBoard summaries.

MWE:

(code snippet missing from the archived page)
4 Answers
  • 2020-12-30 07:18

    EDIT: Upon testing (in v1.1.0, and probably in later versions as well), it is apparent that tf.estimator.Estimator will automatically write summaries for you. I confirmed this using OP's code and tensorboard.

    (Some poking around r1.4 leads me to conclude that this automatic summary writing occurs due to tf.train.MonitoredTrainingSession.)

    Ultimately, the automatic summarizing is accomplished with the use of hooks, so if you wanted to customize the Estimator's default summarizing, you could do so using hooks. Below are the (edited) details from the original answer.


    You'll want to use hooks, formerly known as monitors. (Linked is a conceptual/quickstart guide; the short of it is that the notion of hooking into / monitoring training is built into the Estimator API. A bit confusingly, though, it doesn't seem like the deprecation of monitors for hooks is really documented except in a deprecation annotation in the actual source code...)
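    To illustrate the hook idea itself, independent of TensorFlow's API: a hook is just an object whose callback the training loop invokes at fixed step intervals. A minimal plain-Python sketch (all names here are hypothetical, not TF identifiers):

    ```python
    class SummaryHook:
        """Hypothetical stand-in for a summary-saving hook: fires every N steps."""
        def __init__(self, save_every_n_steps, writer):
            self.n = save_every_n_steps
            self.writer = writer  # any callable that records a summary

        def after_step(self, step, loss):
            if step % self.n == 0:
                self.writer(step, loss)

    def train(num_steps, hooks):
        loss = 100.0
        for step in range(1, num_steps + 1):
            loss *= 0.9  # pretend training shrinks the loss
            for hook in hooks:
                hook.after_step(step, loss)

    records = []
    train(10, hooks=[SummaryHook(5, lambda step, loss: records.append(step))])
    print(records)  # the steps at which the hook fired: [5, 10]
    ```

    TF's MonitoredTrainingSession does essentially this dispatching for you, which is why attaching a SummarySaverHook is enough to get periodic summary writes.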

    Based on your usage, it looks like r1.2's SummarySaverHook fits the bill.

    summary_hook = tf.train.SummarySaverHook(
        save_steps=SAVE_EVERY_N_STEPS,
        output_dir='/tmp/tf',
        summary_op=tf.summary.merge_all())
    

    You may want to customize the hook's initialization parameters, for example by providing an explicit SummaryWriter or by writing every N seconds instead of every N steps.

    If you pass this into the EstimatorSpec, you'll get your customized Summary behavior:

    return tf.estimator.EstimatorSpec(mode=mode, predictions=y, loss=loss,
                                      train_op=train,
                                      training_hooks=[summary_hook])
    

    EDIT NOTE: A previous version of this answer suggested passing the summary_hook into estimator.train(input_fn=input_fn, steps=5, hooks=[summary_hook]). This does not work because tf.summary.merge_all() has to be called in the same context as your model graph.

  • 2020-12-30 07:28

    estimator = tf.estimator.Estimator(model_fn=model, model_dir='/tmp/tf')

    Setting model_dir='/tmp/tf' tells the estimator to write all logs to /tmp/tf. Then run tensorboard --logdir=/tmp/tf and open http://localhost:6006 in your browser to see the graphs.

  • 2020-12-30 07:40

    For me this worked without adding any hooks or merge_all calls. I just added some tf.summary.image(...) calls in my model_fn, and when I train the model they magically appear in TensorBoard. I'm not sure what the exact mechanism is, however. I'm using TensorFlow 1.4.

  • 2020-12-30 07:41

    You can create a SummarySaverHook with tf.summary.merge_all() as the summary_op in the model_fn itself. Pass this hook to the training_hooks param of the EstimatorSpec constructor in your model_fn.

    I don't think what @jagthebeetle said is exactly applicable here: hooks passed to the estimator.train method cannot save the summaries you define in your model_fn, because a merge_all op built outside model_fn will not pick them up; they remain bound to the scope of model_fn's graph.
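    A sketch of that arrangement under TF 1.x (untested here; the layer sizes, save interval, and output directory are placeholders):

    ```python
    import tensorflow as tf

    def model_fn(features, labels, mode):
        y = tf.layers.dense(features['x'], 1)
        loss = tf.losses.mean_squared_error(labels, y)
        tf.summary.scalar('my_loss', loss)  # summary defined inside model_fn

        train_op = tf.train.GradientDescentOptimizer(0.01).minimize(
            loss, global_step=tf.train.get_global_step())

        # Build the hook here, inside model_fn, so that merge_all() runs in
        # the same graph context and actually sees the summaries above.
        summary_hook = tf.train.SummarySaverHook(
            save_steps=100,
            output_dir='/tmp/tf',
            summary_op=tf.summary.merge_all())

        return tf.estimator.EstimatorSpec(
            mode=mode, predictions=y, loss=loss,
            train_op=train_op, training_hooks=[summary_hook])
    ```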
