Learning pseudo labels for semi-and-weakly supervised semantic segmentation. (December 2022)
- Record Type:
- Journal Article
- Title:
- Learning pseudo labels for semi-and-weakly supervised semantic segmentation. (December 2022)
- Main Title:
- Learning pseudo labels for semi-and-weakly supervised semantic segmentation
- Authors:
- Wang, Yude
Zhang, Jie
Kan, Meina
- Shan, Shiguang
- Abstract:
- Highlights: We improve semi-and-weakly supervised semantic segmentation by learning high-quality pseudo labels. A simpler learning target for pseudo label generation is introduced to avoid overfitting. The interaction between two networks progressively produces additional self-supervision to improve representation learning. Our method outperforms the state-of-the-art methods significantly.
Abstract: In this paper, we aim to tackle semi-and-weakly supervised semantic segmentation (SWSSS), where many image-level classification labels and a few pixel-level annotations are available. We believe the most crucial point for solving SWSSS is to produce high-quality pseudo labels, and our method deals with it from two perspectives. Firstly, we introduce a class-aware cross entropy (CCE) loss for network training. Compared to conventional cross entropy loss, CCE loss encourages the model to distinguish concurrent classes only and simplifies the learning target of pseudo label generation. Secondly, we propose a progressive cross training (PCT) method to build cross supervision between two networks with a dynamic evaluation mechanism, which progressively introduces high-quality predictions as additional supervision for network training. Our method significantly improves the quality of generated pseudo labels in the regime with extremely limited annotations. Extensive experiments demonstrate that our approach outperforms state-of-the-art methods significantly. The code is released for public access.
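The idea behind the class-aware cross entropy described in the abstract — letting the model distinguish only the classes that co-occur in an image, as indicated by its image-level labels — can be illustrated with a minimal numerical sketch. This is an assumed formulation for illustration only, not the authors' released code: absent classes are masked out of the softmax before computing the per-pixel cross entropy.

```python
import numpy as np

def class_aware_cross_entropy(logits, target, present):
    """Hedged sketch of a class-aware cross entropy (CCE) loss.

    logits:  (C,) per-pixel class scores.
    target:  int, ground-truth class index (must be a present class).
    present: (C,) boolean mask, True for classes known to occur in the
             image (from its image-level labels). Absent classes are
             excluded from the softmax, so the model only has to
             separate concurrent classes.
    """
    masked = np.where(present, logits, -np.inf)  # drop absent classes
    masked = masked - masked.max()               # numerical stability
    probs = np.exp(masked) / np.exp(masked).sum()
    return -np.log(probs[target])

# Toy example: 4 classes, only the first two occur in the image.
logits = np.array([2.0, 1.0, 0.5, 3.0])
present = np.array([True, True, False, False])
cce = class_aware_cross_entropy(logits, 0, present)
# Restricting the softmax to concurrent classes yields a smaller loss
# than the conventional cross entropy over all 4 classes:
full = class_aware_cross_entropy(logits, 0, np.ones(4, dtype=bool))
```

Passing a mask of all-True recovers the conventional cross entropy, which makes the simplification of the learning target easy to see on toy inputs.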
- Is Part Of:
- Pattern recognition. Volume 132(2022)
- Journal:
- Pattern recognition
- Issue:
- Volume 132(2022)
- Issue Display:
- Volume 132, Issue 2022 (2022)
- Year:
- 2022
- Volume:
- 132
- Issue:
- 2022
- Issue Sort Value:
- 2022-0132-2022-0000
- Page Start:
- Page End:
- Publication Date:
- 2022-12
- Subjects:
- Semi-supervised -- Weakly supervised -- Semi-and-weakly supervised -- Semantic segmentation -- Pseudo label -- Self-training
Pattern perception -- Periodicals
Perception des structures -- Périodiques
Patroonherkenning
006.4
- Journal URLs:
- http://www.sciencedirect.com/science/journal/00313203
http://www.sciencedirect.com/
- DOI:
- 10.1016/j.patcog.2022.108925
- Languages:
- English
- ISSNs:
- 0031-3203
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 23281.xml