This title appears in the Scientific Report: 2021
Please use the identifier: http://hdl.handle.net/2128/27368 in citations.
Please use the identifier: http://dx.doi.org/10.1109/ICMLA51294.2020.00073 in citations.
Loss Scheduling for Class-Imbalanced Image Segmentation Problems
Personal Name(s): Taubert, Oskar; Götz, Markus; Schug, Alexander; Streit, Achim (Corresponding author)
Contributing Institute: Jülich Supercomputing Center (JSC)
Imprint: IEEE, 2020
Physical Description: 426-431
DOI: 10.1109/ICMLA51294.2020.00073
Conference: 2020 19th IEEE International Conference on Machine Learning and Applications (ICMLA), Miami (FL), 2020-12-14 to 2020-12-17
Document Type: Contribution to a conference proceedings
Research Program: Forschergruppe Schug; Computational Science and Mathematical Methods
Link: OpenAccess (Publikationsportal JuSER)
Abstract: When training a classifier, the choice of loss function heavily influences the characteristics of the resulting model. The most commonly used loss function for classification is cross entropy. In image segmentation problems, where each pixel is assigned to a particular class, overlap-based losses have recently been shown to improve classifier performance, especially on datasets with an imbalanced class distribution. This is particularly relevant to segmentation because the class-imbalance mitigation strategies used in regular classification are often not applicable. Overlap-based losses, however, come with drawbacks of their own. We aim to combine the upsides of different losses, while minimizing their downsides, through a simple scheduling scheme during training. Gradually transitioning from an overlap-based dice loss to cross entropy reliably selects a distinct minimum in the optimization landscape, providing a valuable alternative to the results obtained from traditional, unscheduled loss functions. We demonstrate the efficacy of our approach on different combinations of loss functions, datasets, and models.
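The scheduling idea described in the abstract (gradually shifting weight from an overlap-based dice loss to cross entropy over the course of training) can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact configuration: the linear ramp, the binary (foreground/background) setting, and all function names here are assumptions for demonstration.

```python
import math

def schedule_weight(epoch, total_epochs):
    """Assumed linear ramp from 0 to 1: the fraction of cross entropy in the mix."""
    return min(1.0, epoch / max(1, total_epochs - 1))

def soft_dice_loss(probs, targets, eps=1e-6):
    """1 - soft Dice coefficient over per-pixel foreground probabilities."""
    inter = sum(p * t for p, t in zip(probs, targets))
    denom = sum(probs) + sum(targets)
    return 1.0 - (2.0 * inter + eps) / (denom + eps)

def cross_entropy_loss(probs, targets, eps=1e-12):
    """Mean binary cross entropy over pixels."""
    return -sum(t * math.log(p + eps) + (1 - t) * math.log(1 - p + eps)
                for p, t in zip(probs, targets)) / len(probs)

def scheduled_loss(probs, targets, epoch, total_epochs):
    """Blended objective: pure dice at epoch 0, pure cross entropy at the end."""
    alpha = schedule_weight(epoch, total_epochs)
    return ((1 - alpha) * soft_dice_loss(probs, targets)
            + alpha * cross_entropy_loss(probs, targets))
```

In an actual training loop, `scheduled_loss` would be evaluated on the network's per-pixel class probabilities each step, so early epochs optimize the overlap-based objective and later epochs refine against cross entropy.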