Residential College: false
Status: Published
PSANet: prototype-guided salient attention for few-shot segmentation
Li, Hao [1]; Huang, Guoheng [1]; Yuan, Xiaochen [2]; Zheng, Zewen [1]; Chen, Xuhang [3]; Zhong, Guo [4]; Pun, Chi Man [5]
2024
Source Publication: Visual Computer
ISSN: 0178-2789
Abstract

Few-shot semantic segmentation aims to learn a generalized model for unseen-class segmentation from just a few densely annotated samples. Most current metric-based prototype learning models derive prototypes directly from support samples through Masked Average Pooling and use them to guide query-sample segmentation. However, these methods frequently overlook the semantic ambiguity of prototypes, their limited performance under extreme object variations, and the semantic similarities between different classes. In this paper, we introduce a novel network architecture named Prototype-guided Salient Attention Network (PSANet). Specifically, we employ prototype-guided attention to learn salient regions, allocating different attention weights to features at different spatial locations of the target to enhance the significance of salient regions within the prototype. To mitigate the impact of external distractor categories on the prototype, our proposed contrastive loss learns a more discriminative prototype, promoting inter-class feature separation and intra-class feature compactness. Moreover, we introduce a refinement operation for the multi-scale module to better capture complete contextual information across features at various scales. Despite its simplicity, the effectiveness of our approach is demonstrated by extensive experiments on the PASCAL-5ⁱ and COCO-20ⁱ datasets. Our code is available at https://github.com/woaixuexixuexi/PSANet.
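The following minimal PyTorch sketch is an editor-added illustration, not code from the paper or its repository. It shows how a class prototype can be obtained from support features via Masked Average Pooling, and one plausible way a prototype-guided spatial attention could weight query-feature locations by their similarity to that prototype. Function names (masked_average_pooling, prototype_guided_attention) and all tensor shapes are hypothetical; the contrastive loss and multi-scale refinement described above are not shown.

    # Hypothetical sketch (not the authors' implementation).
    import torch
    import torch.nn.functional as F

    def masked_average_pooling(support_feat, support_mask):
        """support_feat: (B, C, H, W) backbone features; support_mask: (B, 1, h, w) binary foreground mask."""
        # Resize the support mask to the feature resolution.
        mask = F.interpolate(support_mask.float(), size=support_feat.shape[-2:],
                             mode="bilinear", align_corners=False)
        # Average features over foreground pixels only (normalize by mask area).
        prototype = (support_feat * mask).sum(dim=(2, 3)) / (mask.sum(dim=(2, 3)) + 1e-6)
        return prototype  # (B, C)

    def prototype_guided_attention(query_feat, prototype):
        """Weight each query location by its cosine similarity to the class prototype."""
        B, C, H, W = query_feat.shape
        proto = prototype.view(B, C, 1, 1).expand_as(query_feat)
        sim = F.cosine_similarity(query_feat, proto, dim=1)   # (B, H, W)
        attn = torch.sigmoid(sim).unsqueeze(1)                # salient-region weights in (0, 1)
        return query_feat * attn + query_feat                 # emphasize salient regions (residual form)

    # Toy usage with random tensors standing in for support/query features.
    if __name__ == "__main__":
        s_feat = torch.randn(2, 256, 32, 32)
        s_mask = (torch.rand(2, 1, 128, 128) > 0.5)
        q_feat = torch.randn(2, 256, 32, 32)
        proto = masked_average_pooling(s_feat, s_mask)
        out = prototype_guided_attention(q_feat, proto)
        print(proto.shape, out.shape)  # (2, 256) and (2, 256, 32, 32)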

Keyword: Attention Mechanism; Contrastive Learning; Few-shot Segmentation; Semantic Segmentation
DOI: 10.1007/s00371-024-03582-1
Indexed By: SCIE
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Software Engineering
WOS ID: WOS:001282038800001
Publisher: SPRINGER, ONE NEW YORK PLAZA, SUITE 4600, NEW YORK, NY 10004, UNITED STATES
Scopus ID: 2-s2.0-85200143637
Document Type: Journal article
Collection: Department of Computer and Information Science
Corresponding Author: Huang, Guoheng; Zhong, Guo
Affiliation:
1. Guangdong University of Technology, Guangzhou, China
2. Macao Polytechnic University, Macao
3. Huizhou University, Huizhou, China
4. Guangdong University of Foreign Studies, Guangzhou, China
5. University of Macau, Macao
Recommended Citation
GB/T 7714: Li, Hao, Huang, Guoheng, Yuan, Xiaochen, et al. PSANet: prototype-guided salient attention for few-shot segmentation[J]. Visual Computer, 2024.
APA: Li, Hao, Huang, Guoheng, Yuan, Xiaochen, Zheng, Zewen, Chen, Xuhang, Zhong, Guo, & Pun, Chi Man (2024). PSANet: prototype-guided salient attention for few-shot segmentation. Visual Computer.
MLA: Li, Hao, et al. "PSANet: prototype-guided salient attention for few-shot segmentation." Visual Computer (2024).
Files in This Item:
There are no files associated with this item.
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.