I am wondering how Keras computes a metric (custom or not).
For example, suppose I have the following metric, which yields the maximal error between the prediction and the ground truth.
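For concreteness, such a metric could look like this (a minimal sketch using the Keras backend; this is an illustration, not the exact snippet from the question):

from keras import backend as K

def max_error(y_true, y_pred):
    # Largest absolute difference between ground truth and prediction
    return K.max(K.abs(y_true - y_pred))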
Dennis has already explained this clearly.
One more thing to point out: if you want to compute the metric over the whole training set, or if your custom metric should be computed in a single pass rather than averaged over batches, you can use a Keras callback and override on_epoch_end. Inside on_epoch_end you can compute the metric on the full training data, like this:
def on_epoch_end(self, epoch, logs=None):
    # Evaluate the metric in one pass over the full training set
    y_pred = self.model.predict(self.X_train, verbose=0)
    score = max_error(self.y_train, y_pred)
    # Evaluate the metric in one pass over the full validation set
    y_val_pred = self.model.predict(self.X_val, verbose=0)
    val_score = max_error(self.y_val, y_val_pred)
    print("\nMax error - epoch: %d - train score: %.6f - val score: %.6f" % (epoch + 1, score, val_score))
You also need to make the training and validation data available to the callback, for example by passing them to its constructor; the validation data can additionally be passed to model.fit via the validation_data parameter.
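Putting this together, here is a minimal runnable sketch under those assumptions (MaxErrorCallback, max_error_np, and the dummy data/model are illustrative names, not part of Keras):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import Callback

def max_error_np(y_true, y_pred):
    # NumPy version of the metric: model.predict returns arrays, not tensors
    return np.max(np.abs(np.asarray(y_true).ravel() - np.asarray(y_pred).ravel()))

class MaxErrorCallback(Callback):
    def __init__(self, X_train, y_train, X_val, y_val):
        super(MaxErrorCallback, self).__init__()
        self.X_train, self.y_train = X_train, y_train
        self.X_val, self.y_val = X_val, y_val

    def on_epoch_end(self, epoch, logs=None):
        # One pass over the full training and validation sets, no batch averaging
        train_score = max_error_np(self.y_train, self.model.predict(self.X_train, verbose=0))
        val_score = max_error_np(self.y_val, self.model.predict(self.X_val, verbose=0))
        print("\nMax error - epoch: %d - train: %.6f - val: %.6f" % (epoch + 1, train_score, val_score))

# Dummy data and model just to make the example runnable
X_train, y_train = np.random.rand(100, 10), np.random.rand(100, 1)
X_val, y_val = np.random.rand(20, 10), np.random.rand(20, 1)

model = Sequential([Dense(8, activation="relu", input_shape=(10,)), Dense(1)])
model.compile(optimizer="adam", loss="mse")

model.fit(X_train, y_train,
          validation_data=(X_val, y_val),
          callbacks=[MaxErrorCallback(X_train, y_train, X_val, y_val)],
          epochs=3, verbose=0)

Note that the callback works on the NumPy arrays returned by model.predict, so the metric is evaluated in a single pass over the full dataset instead of being averaged over batches the way compiled metrics are.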