Keras: Add variables to progress bar

Submitted by 蓝咒 on 2019-11-28 19:21:25

Question


I'd like to monitor e.g. the learning rate during training in Keras, both in the progress bar and in TensorBoard. I figure there must be a way to specify which variables are logged, but the Keras website offers no immediate clarification on this.

I guess it has something to do with creating a custom callback; however, it should be possible to modify the already existing progress bar callback, no?


Answer 1:


It can be achieved via a custom metric. Take the learning rate as an example:

import numpy as np
from keras.layers import Input, Dense
from keras.models import Model
from keras.optimizers import Adam
from keras.callbacks import LearningRateScheduler, TensorBoard

def get_lr_metric(optimizer):
    # The metric closes over the optimizer and reports its lr variable
    def lr(y_true, y_pred):
        return optimizer.lr
    return lr

x = Input((50,))
out = Dense(1, activation='sigmoid')(x)
model = Model(x, out)

optimizer = Adam(lr=0.001)
lr_metric = get_lr_metric(optimizer)
model.compile(loss='binary_crossentropy', optimizer=optimizer,
              metrics=['acc', lr_metric])

# Halve the learning rate every 2 epochs
cbks = [LearningRateScheduler(lambda epoch: 0.001 * 0.5 ** (epoch // 2)),
        TensorBoard(write_graph=False)]
X = np.random.rand(1000, 50)
Y = np.random.randint(2, size=1000)
model.fit(X, Y, epochs=10, callbacks=cbks)

The LR will be printed in the progress bar:

Epoch 1/10
1000/1000 [==============================] - 0s 103us/step - loss: 0.8228 - acc: 0.4960 - lr: 0.0010
Epoch 2/10
1000/1000 [==============================] - 0s 61us/step - loss: 0.7305 - acc: 0.4970 - lr: 0.0010
Epoch 3/10
1000/1000 [==============================] - 0s 62us/step - loss: 0.7145 - acc: 0.4730 - lr: 5.0000e-04
Epoch 4/10
1000/1000 [==============================] - 0s 58us/step - loss: 0.7129 - acc: 0.4800 - lr: 5.0000e-04
Epoch 5/10
1000/1000 [==============================] - 0s 58us/step - loss: 0.7124 - acc: 0.4810 - lr: 2.5000e-04
Epoch 6/10
1000/1000 [==============================] - 0s 63us/step - loss: 0.7123 - acc: 0.4790 - lr: 2.5000e-04
Epoch 7/10
1000/1000 [==============================] - 0s 61us/step - loss: 0.7119 - acc: 0.4840 - lr: 1.2500e-04
Epoch 8/10
1000/1000 [==============================] - 0s 61us/step - loss: 0.7117 - acc: 0.4880 - lr: 1.2500e-04
Epoch 9/10
1000/1000 [==============================] - 0s 59us/step - loss: 0.7116 - acc: 0.4880 - lr: 6.2500e-05
Epoch 10/10
1000/1000 [==============================] - 0s 63us/step - loss: 0.7115 - acc: 0.4880 - lr: 6.2500e-05

Then, you can visualize the LR curve in TensorBoard (e.g. by pointing tensorboard --logdir at the callback's log directory, ./logs by default).
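If you would rather not register a dummy metric, the same effect can be had from a plain custom callback that injects the current learning rate into the logs dict at the end of each epoch; any callback listed after it (such as TensorBoard) then sees the extra entry. A minimal sketch, assuming the standard Keras imports (the LRLogger name is just illustrative):

import keras.backend as K
from keras.callbacks import Callback

class LRLogger(Callback):
    """Copies the optimizer's current learning rate into the epoch logs."""
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # K.get_value evaluates the backend variable that stores the lr
        logs['lr'] = float(K.get_value(self.model.optimizer.lr))

# Place it before TensorBoard so the extra key gets written out:
# cbks = [LRLogger(), TensorBoard(write_graph=False)]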




Answer 2:


Another way (in fact the encouraged one) to pass custom values to TensorBoard is to subclass the keras.callbacks.TensorBoard class. This lets you apply custom functions to obtain the desired metrics and pass them directly to TensorBoard.

Here is an example for the learning rate of the Adam optimizer:

import numpy as np
import keras.backend as K
from keras.callbacks import TensorBoard

class SubTensorBoard(TensorBoard):
    def __init__(self, *args, **kwargs):
        super(SubTensorBoard, self).__init__(*args, **kwargs)

    def lr_getter(self):
        # Fetch the optimizer's hyperparameter variables
        decay = self.model.optimizer.decay
        lr = self.model.optimizer.lr
        iters = self.model.optimizer.iterations  # only this one changes during training
        beta_1 = self.model.optimizer.beta_1
        beta_2 = self.model.optimizer.beta_2
        # Recompute the effective Adam learning rate at the current step
        lr = lr * (1. / (1. + decay * K.cast(iters, K.dtype(decay))))
        t = K.cast(iters, K.floatx()) + 1
        lr_t = lr * (K.sqrt(1. - K.pow(beta_2, t)) / (1. - K.pow(beta_1, t)))
        return np.float32(K.eval(lr_t))

    def on_epoch_end(self, epoch, logs={}):
        logs.update({"lr": self.lr_getter()})
        super(SubTensorBoard, self).on_epoch_end(epoch, logs)
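
It drops in wherever the stock TensorBoard callback would go; a minimal usage sketch, reusing the model, X, and Y from the first answer:

tb = SubTensorBoard(log_dir='./logs', write_graph=False)
model.fit(X, Y, epochs=10, callbacks=[tb])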


Source: https://stackoverflow.com/questions/48198031/keras-add-variables-to-progress-bar
