I'm using Keras to predict a time series. As standard, I'm training for 20 epochs. I want to check whether my model is learning well by making a prediction after each of the 20 epochs.
I think there is a bit of confusion here.
An epoch is only meaningful while training the neural network, so when training stops (in this case, after the 20th epoch), the weights are the ones computed at the last epoch.
Keras prints the current loss on the validation set after each training epoch. The weights from intermediate epochs are lost unless you save them. You can save the weights for each epoch with the ModelCheckpoint callback and load them back later with load_weights on your model.
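For example, a minimal sketch of saving and restoring per-epoch weights could look like this (assuming model, X_train and y_train are already defined; the filename pattern is just an illustration):

import tensorflow as tf

# save only the weights after every epoch; {epoch:02d} is replaced with the epoch number
checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(
    filepath='weights_epoch_{epoch:02d}.h5',
    save_weights_only=True)

model.fit(X_train, y_train, epochs=20, callbacks=[checkpoint_cb])

# later: restore the weights computed at, say, epoch 5
model.load_weights('weights_epoch_05.h5')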
You can also compute predictions after each training epoch by writing your own callback: subclass Callback and call predict on the model inside the on_epoch_end method.
To use it, instantiate your callback, put it in a list, and pass that list as the callbacks keyword argument of model.fit.
The following code will do the desired job:
import tensorflow as tf

# custom callback that predicts on the validation set after every epoch
class PredictionCallback(tf.keras.callbacks.Callback):
    # the validation data is passed in explicitly, since tf.keras callbacks
    # no longer expose self.validation_data
    def __init__(self, validation_data):
        super().__init__()
        self.x_val, _ = validation_data

    def on_epoch_end(self, epoch, logs=None):
        y_pred = self.model.predict(self.x_val, verbose=0)
        print('prediction: {} at epoch: {}'.format(y_pred, epoch))

# ...

# register the callback before training starts
model.fit(X_train, y_train, batch_size=32, epochs=20,
          validation_data=(X_valid, y_valid),
          callbacks=[PredictionCallback((X_valid, y_valid))])