Keras: how to output learning rate onto tensorboard

Backend · Unresolved · 4 answers · 503 views
無奈伤痛 2021-02-05 10:29

I add a callback to decay learning rate:

    keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=100,
                                      verbose=1)

How do I also get the current learning rate written to TensorBoard?


        
4 Answers
  •  长情又很酷
    2021-02-05 11:19

    Note that with the current nightly version of tf (2.5, and probably earlier) learning rates set via a LearningRateSchedule are automatically added to TensorBoard's logs. The following solution is only necessary if you're adapting the learning rate some other way, e.g. via the ReduceLROnPlateau or LearningRateScheduler (not to be confused with LearningRateSchedule) callbacks.
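    To illustrate the distinction: a LearningRateSchedule such as ExponentialDecay is a pure function of the training step attached directly to the optimizer, which is what lets TF log it automatically. A minimal pure-Python mirror of the formula ExponentialDecay uses (illustrative only, not the TF class):

```python
# ExponentialDecay evaluates lr(step) = initial_lr * decay_rate ** (step / decay_steps).
# Pure-Python mirror for illustration; the real class is
# tf.keras.optimizers.schedules.ExponentialDecay.
def exponential_decay(step, initial_lr=1e-3, decay_rate=0.9, decay_steps=1000):
    return initial_lr * decay_rate ** (step / decay_steps)

print(exponential_decay(0))     # 0.001
print(exponential_decay(1000))  # ~0.0009, one full decay period
```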

    While extending tf.keras.callbacks.TensorBoard is a viable option, I prefer composition over subclassing.

    class LearningRateLogger(tf.keras.callbacks.Callback):
        def __init__(self):
            super().__init__()
            # Tell Keras this callback can handle tensor-valued logs, so the
            # lr tensor is passed through without eager numpy conversion.
            self._supports_tf_logs = True

        def on_epoch_end(self, epoch, logs=None):
            # Only add the entry if no other callback has logged it already.
            if logs is None or "learning_rate" in logs:
                return
            logs["learning_rate"] = self.model.optimizer.lr
    

    This allows us to compose multiple similar callbacks, and use the logged learning rate in multiple other callbacks (e.g. if you add a CSVLogger it should also write the learning rate values to file).

    Then in model.fit

    model.fit(
        callbacks=[
            LearningRateLogger(),
            # other callbacks that update `logs`
            tf.keras.callbacks.TensorBoard(path),
            # other callbacks that use updated logs, e.g. CSVLogger
        ],
        **kwargs
    )
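    As a sanity check on the ordering, here is a small TensorFlow-free sketch of how Keras hands the same logs dict to each callback in list order — the logger must come before any callback that reads the entry. Class and variable names here are illustrative stand-ins, not Keras APIs:

```python
# Stand-in sketch (no TensorFlow needed): the same `logs` dict is passed to
# each callback in list order, so a callback that ADDS "learning_rate" must
# run before the callbacks that READ it.

class LearningRateLogger:
    """Adds the current learning rate to the shared logs dict."""
    def __init__(self, get_lr):
        self.get_lr = get_lr  # stand-in for self.model.optimizer.lr

    def on_epoch_end(self, epoch, logs=None):
        if logs is None or "learning_rate" in logs:
            return
        logs["learning_rate"] = self.get_lr()

class RecordingLogger:
    """Stand-in for TensorBoard/CSVLogger: records whatever logs it sees."""
    def __init__(self):
        self.rows = []

    def on_epoch_end(self, epoch, logs=None):
        self.rows.append(dict(logs or {}))

lr = 0.1
recorder = RecordingLogger()
callbacks = [LearningRateLogger(lambda: lr), recorder]  # logger first

for epoch in range(2):
    logs = {"loss": 1.0 / (epoch + 1)}
    for cb in callbacks:          # mimics Keras dispatching the callback list
        cb.on_epoch_end(epoch, logs)
    lr *= 0.5                     # pretend ReduceLROnPlateau halved the rate

print(recorder.rows)
```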
    
