[Feature] Early Stopping, Validation Loss
1dmesh opened this issue · 1 comment
1dmesh commented
What is the feature?
Early stopping based on the validation loss metric would be nice. From my limited understanding, this is hard to do under the current system of extending BaseMetric: with IoUMetric, at least, there is no way to early stop on validation loss. If this is impractical, or if the functionality already exists, please let me know!
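Until validation loss is exposed to the hook, one possible workaround is to early-stop on a metric the evaluator already reports. A hedged sketch of such a config, assuming MMEngine's `EarlyStoppingHook` (which also accepts a `rule` argument) and the `mIoU` key that MMSegmentation's evaluator emits:

```python
# Sketch: monitor an existing evaluation key instead of the (unavailable)
# validation loss. 'mIoU' is one of the dict_keys listed in the warning.
custom_hooks = [
    dict(
        type='EarlyStoppingHook',
        monitor='mIoU',      # a key the evaluator actually reports
        rule='greater',      # higher mIoU is better, unlike a loss
        min_delta=0.01,
        patience=10,
    )
]
```

This stops training when mIoU plateaus rather than when loss does, which is usually a close proxy but not identical.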
For more context, see below where I have tried using EarlyStoppingHook.
Any other context?
- `loss`:
  `early_stopping=dict(type='EarlyStoppingHook', monitor='loss', min_delta=0.01, patience=10)`
  → `UserWarning: Skip early stopping process since the evaluation results (dict_keys(['aAcc', 'mIoU', 'mAcc', 'mDice', 'mFscore', 'mPrecision', 'mRecall'])) do not include monitor (loss)`
- `val_loss`:
  `early_stopping=dict(type='EarlyStoppingHook', monitor='val_loss', min_delta=0.01, patience=10)`
  → `UserWarning: Skip early stopping process since the evaluation results (dict_keys(['aAcc', 'mIoU', 'mAcc', 'mDice', 'mFscore', 'mPrecision', 'mRecall'])) do not include monitor (val_loss)`
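For reference, the patience/min_delta behavior the hook applies to whatever key it monitors can be sketched in plain Python (an illustration only, not MMEngine's implementation; it assumes lower is better, as for a loss):

```python
class EarlyStopper:
    """Minimal sketch of patience/min_delta early stopping for a loss-like
    metric: stop once the value fails to improve by at least min_delta
    for `patience` consecutive checks."""

    def __init__(self, min_delta=0.01, patience=10):
        self.min_delta = min_delta
        self.patience = patience
        self.best = float('inf')
        self.bad_epochs = 0

    def step(self, value):
        # An improvement must beat the best value by more than min_delta.
        if value < self.best - self.min_delta:
            self.best = value
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience  # True -> stop training
```

With `patience=3`, three consecutive non-improving validation checks trigger the stop.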
Related: