Constrained-CNN losses for weakly supervised segmentation
dc.contributor.author | Kervadec, Hoel | |
dc.contributor.author | Dolz, Jose | |
dc.contributor.author | Tang, Meng | |
dc.contributor.author | Granger, Eric | |
dc.contributor.author | Boykov, Yuri | |
dc.contributor.author | Ben Ayed, Ismail | |
dc.date.accessioned | 2020-02-18T18:20:20Z | |
dc.date.available | 2020-02-18T18:20:20Z | |
dc.date.issued | 2019-05 | |
dc.description | The final publication is available at Elsevier via https://doi.org/10.1016/j.media.2019.02.009. © 2019. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/ | en |
dc.description.abstract | Weakly-supervised learning based on, e.g., partially labelled images or image-tags, is currently attracting significant attention in CNN segmentation, as it can mitigate the need for full and laborious pixel/voxel annotations. Enforcing high-order (global) inequality constraints on the network output (for instance, to constrain the size of the target region) can leverage unlabeled data, guiding the training process with domain-specific knowledge. Inequality constraints are very flexible because they do not assume exact prior knowledge. However, constrained Lagrangian dual optimization has been largely avoided in deep networks, mainly for computational tractability reasons. To the best of our knowledge, the method of Pathak et al. (2015a) is the only prior work that addresses deep CNNs with linear constraints in weakly supervised segmentation. It uses the constraints to synthesize fully-labeled training masks (proposals) from weak labels, mimicking full supervision and facilitating dual optimization. We propose to introduce a differentiable penalty, which enforces inequality constraints directly in the loss function, avoiding expensive Lagrangian dual iterates and proposal generation. From a constrained-optimization perspective, our simple penalty-based approach is not optimal, as there is no guarantee that the constraints are satisfied. However, surprisingly, it yields substantially better results than the Lagrangian-based constrained CNNs in Pathak et al. (2015a), while reducing the computational demand for training. By annotating only a small fraction of the pixels, the proposed approach can reach a level of segmentation performance that is comparable to full supervision on three separate tasks. While our experiments focused on basic linear constraints, such as the target-region size and image tags, our framework can be easily extended to other non-linear constraints, e.g., invariant shape moments (Klodt and Cremers, 2011) and other region statistics (Lim et al., 2014). Therefore, it has the potential to close the gap between weakly and fully supervised learning in semantic medical image segmentation. Our code is publicly available. | en |
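The abstract describes enforcing an inequality constraint on the target-region size as a differentiable penalty added directly to the loss. A minimal sketch of that idea is below, assuming the "soft size" of the predicted region is the sum of per-pixel foreground probabilities and that the penalty is quadratic outside the allowed size range (an illustrative formulation, not the paper's exact code):

```python
import numpy as np

def size_penalty(probs, a, b):
    """Quadratic penalty pushing the soft region size into [a, b].

    probs: array of per-pixel foreground probabilities (e.g., softmax output).
    a, b:  lower and upper bounds on the target-region size, in pixels.
    Returns 0 when the constraint a <= sum(probs) <= b is satisfied,
    and grows quadratically with the violation otherwise.
    """
    v = float(np.sum(probs))  # differentiable "soft size" of the region
    if v < a:
        return (v - a) ** 2
    if v > b:
        return (v - b) ** 2
    return 0.0

# Illustrative use: a 4x4 prediction whose soft size (8.0) lies inside [5, 10]
pred = np.full((4, 4), 0.5)
print(size_penalty(pred, a=5, b=10))   # no violation -> 0.0
print(size_penalty(pred, a=10, b=20))  # size 8 < 10 -> (8-10)^2 = 4.0
```

In training, this scalar would be added (with a weight) to the partial cross-entropy over the labelled pixels, so unlabeled pixels still contribute gradient through the size term.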
dc.description.sponsorship | This work is supported by the Natural Sciences and Engineering Research Council of Canada (NSERC), Discovery Grant program, and by the ETS Research Chair on Artificial Intelligence in Medical Imaging. | en |
dc.identifier.uri | https://doi.org/10.1016/j.media.2019.02.009 | |
dc.identifier.uri | http://hdl.handle.net/10012/15655 | |
dc.language.iso | en | en |
dc.publisher | Elsevier | en |
dc.relation.uri | https://www.github.com/LIVIAETS/SizeLoss_WSS | en |
dc.rights | Attribution-NonCommercial-NoDerivatives 4.0 International | * |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ | * |
dc.subject | deep learning | en |
dc.subject | semantic segmentation | en |
dc.subject | weakly-supervised learning | en |
dc.subject | CNN constraints | en |
dc.title | Constrained-CNN losses for weakly supervised segmentation | en |
dc.type | Article | en |
dcterms.bibliographicCitation | Kervadec, Hoel, Jose Dolz, Meng Tang, Eric Granger, Yuri Boykov, and Ismail Ben Ayed. “Constrained-CNN Losses for Weakly Supervised Segmentation.” Medical Image Analysis 54 (May 1, 2019): 88–99. https://doi.org/10.1016/j.media.2019.02.009. | en |
uws.contributor.affiliation1 | Faculty of Mathematics | en |
uws.contributor.affiliation2 | David R. Cheriton School of Computer Science | en |
uws.peerReviewStatus | Reviewed | en |
uws.scholarLevel | Faculty | en |
uws.scholarLevel | Graduate | en |
uws.typeOfResource | Text | en |
Files
Original bundle
- Name: 1-s2.0-S1361841518306145-main.pdf
- Size: 4.12 MB
- Format: Adobe Portable Document Format
- Description: Accepted manuscript