
I am trying to classify credit card fraud with a Keras model. Because the dataset is imbalanced, I need to use the F1 score to improve recall.

Apparently, Keras is not accepting my F1 function definition. How can I monitor my new metrics at each epoch? Early stopping works fine with val_loss but not with the metrics I defined. I receive this message:

Train on 139554 samples, validate on 59810 samples
Epoch 1/10

7s - loss: 0.3585 - acc: 0.9887 - val_loss: 0.0560 - val_acc: 0.9989
/home/libardo/anaconda3/lib/python3.6/site-packages/keras/callbacks.py:526: RuntimeWarning: Early stopping conditioned on metric f1s which is not available. Available metrics are: val_loss,val_acc,loss,acc
(self.monitor, ','.join(list(logs.keys()))), RuntimeWarning

This looks like the problem reported in Keras issue #10018, "EarlyStopping is ignoring my custom metrics defined".

Remark: I was not able to paste my code here; I apologize for that.


1 Answer


I realize this was posted a long time back, but I found this question while searching for the same answer and eventually figured it out myself. In short, you need to both name the metric in the EarlyStopping callback and pass the function itself as a metric when compiling the model.

OK, so you've defined your custom metric with something like this (taken from https://github.com/keras-team/keras/issues/10018, which itself was taken from https://stackoverflow.com/a/45305384/5210098):

# https://stackoverflow.com/a/45305384/5210098
from keras import backend as K

def f1_metric(y_true, y_pred):

    def recall(y_true, y_pred):
        # recall = true positives / all actual positives
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
        return true_positives / (possible_positives + K.epsilon())

    def precision(y_true, y_pred):
        # precision = true positives / all predicted positives
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
        return true_positives / (predicted_positives + K.epsilon())

    precision = precision(y_true, y_pred)
    recall = recall(y_true, y_pred)
    # Harmonic mean of precision and recall; K.epsilon() guards against
    # division by zero.
    return 2 * ((precision * recall) / (precision + recall + K.epsilon()))

Now, to use this with your EarlyStopping callback, pass the metric's name as a string: EarlyStopping(monitor='f1_metric') to monitor it on the training set, or EarlyStopping(monitor='val_f1_metric') to monitor it on the validation set.
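One caveat, which is an addition to the original answer: with the default mode='auto', the old Keras EarlyStopping only assumes "higher is better" when the monitored name contains 'acc', so for a name like val_f1_metric it will try to minimize, which is the wrong direction for F1. Passing mode='max' explicitly avoids this. A short sketch, with patience=5 as an arbitrary example value:

from keras.callbacks import EarlyStopping

# mode='max': stop when validation F1 stops increasing.
# patience=5 is an illustrative value, not from the original post.
early_stop = EarlyStopping(monitor='val_f1_metric', mode='max', patience=5)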

But that's not enough! If you stop there, you'll get the error you got. You also need to supply the actual function as an argument when you compile the model: model.compile(metrics=[f1_metric]). Note the lack of quotation marks: you are passing the function object itself, not a string. That is what makes f1_metric (and val_f1_metric) appear in the logs that EarlyStopping reads.

If you both include the function via the metrics keyword when compiling and name it in the EarlyStopping callback, it should work cleanly.
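Putting the pieces together, a minimal sketch of the whole flow. The optimizer, loss, patience, and the model/X_train/y_train names are illustrative placeholders, not from the original post:

from keras.callbacks import EarlyStopping

# Pass the function object (no quotes) so Keras computes f1_metric, and
# therefore val_f1_metric, at the end of every epoch.
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy', f1_metric])

# Monitor the validation-set F1 by name; mode='max' because higher is better.
early_stop = EarlyStopping(monitor='val_f1_metric', mode='max', patience=5)

model.fit(X_train, y_train,
          validation_split=0.3,
          epochs=10,
          callbacks=[early_stop],
          verbose=2)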