{"title":"Boundary-Aware Cross-Level Multi-Scale Fusion Network for RGB-D Salient Object Detection","authors":"Zhijun Zheng;Yanbin Peng","doi":"10.1109/ACCESS.2025.3549945","DOIUrl":null,"url":null,"abstract":"Accurate salient object detection is of great importance in many computer vision applications. However, due to scale variation and complex backgrounds, achieving effective detection of objects at different scales in various scenes remains a challenging task. To address this, we propose a novel Boundary-Aware Cross-Level Multi-Scale Fusion Network (BCMNet), which enhances salient object detection by fully exploiting cross-level and multi-scale features. Specifically, we propose a Cross-Attention Fusion Module (CAFM) to fuse two modality features, generating modality fusion features. Next, a Boundary-Aware Module (BAM) combines low-level features with high-level features to learn boundary-aware features, which are integrated into each decoding unit during the decoding process. During the decoding stage, a Bidirectional Cross-Level Multi-Scale Module (BCMM) is introduced to effectively integrate cross-level features and perform multi-scale learning. Finally, the output of the BCMM, combined with boundary-aware features, generates saliency prediction maps. We conduct extensive experiments on six datasets, and the experimental results show that, compared to the state-of-the-art methods, the proposed model improves MAE, maxF, maxE, and S metrics by <inline-formula> <tex-math>$0\\sim 8$ </tex-math></inline-formula>%, <inline-formula> <tex-math>$0\\sim 1.34$ </tex-math></inline-formula>%, 0.11%~0.54%, and <inline-formula> <tex-math>$0\\sim 0.45$ </tex-math></inline-formula>%, respectively.","PeriodicalId":13079,"journal":{"name":"IEEE Access","volume":"13 ","pages":"48271-48285"},"PeriodicalIF":3.4000,"publicationDate":"2025-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10930455","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Access","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10930455/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0
Abstract
Accurate salient object detection is of great importance in many computer vision applications. However, due to scale variation and complex backgrounds, achieving effective detection of objects at different scales in various scenes remains a challenging task. To address this, we propose a novel Boundary-Aware Cross-Level Multi-Scale Fusion Network (BCMNet), which enhances salient object detection by fully exploiting cross-level and multi-scale features. Specifically, we propose a Cross-Attention Fusion Module (CAFM) to fuse the two modality features, generating modality fusion features. Next, a Boundary-Aware Module (BAM) combines low-level features with high-level features to learn boundary-aware features, which are injected into each decoding unit. In the decoding stage, a Bidirectional Cross-Level Multi-Scale Module (BCMM) is introduced to effectively integrate cross-level features and perform multi-scale learning. Finally, the output of the BCMM, combined with the boundary-aware features, generates the saliency prediction maps. We conduct extensive experiments on six datasets, and the experimental results show that, compared to state-of-the-art methods, the proposed model improves the MAE, maxF, maxE, and S metrics by $0\sim 8\%$, $0\sim 1.34\%$, $0.11\sim 0.54\%$, and $0\sim 0.45\%$, respectively.
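No code accompanies this abstract, so the following is only a rough, hypothetical sketch of the kind of cross-modal attention fusion the CAFM describes (fusing RGB and depth features so each modality attends to the other). The class name CrossAttentionFusion, the use of nn.MultiheadAttention, and all shapes and hyper-parameters are assumptions for illustration, not the authors' implementation.

    # Illustrative sketch only: a generic cross-attention fusion of RGB and
    # depth feature maps, in the spirit of the CAFM described in the abstract.
    import torch
    import torch.nn as nn

    class CrossAttentionFusion(nn.Module):
        """Fuse RGB and depth feature maps with mutual (cross) attention."""

        def __init__(self, channels: int, num_heads: int = 4):
            super().__init__()
            # channels must be divisible by num_heads.
            self.rgb_to_depth = nn.MultiheadAttention(channels, num_heads, batch_first=True)
            self.depth_to_rgb = nn.MultiheadAttention(channels, num_heads, batch_first=True)
            self.proj = nn.Conv2d(2 * channels, channels, kernel_size=1)

        def forward(self, f_rgb: torch.Tensor, f_depth: torch.Tensor) -> torch.Tensor:
            # Flatten spatial dims: (B, C, H, W) -> (B, H*W, C) token sequences.
            b, c, h, w = f_rgb.shape
            rgb_tokens = f_rgb.flatten(2).transpose(1, 2)
            depth_tokens = f_depth.flatten(2).transpose(1, 2)

            # Each modality queries the other, so complementary cues are exchanged.
            rgb_attn, _ = self.rgb_to_depth(rgb_tokens, depth_tokens, depth_tokens)
            depth_attn, _ = self.depth_to_rgb(depth_tokens, rgb_tokens, rgb_tokens)

            # Restore the spatial layout and fuse with a 1x1 convolution.
            rgb_attn = rgb_attn.transpose(1, 2).reshape(b, c, h, w)
            depth_attn = depth_attn.transpose(1, 2).reshape(b, c, h, w)
            return self.proj(torch.cat([rgb_attn, depth_attn], dim=1))

    # Example usage (hypothetical shapes): fuse one level of backbone features.
    f_rgb = torch.randn(2, 64, 32, 32)
    f_depth = torch.randn(2, 64, 32, 32)
    fused = CrossAttentionFusion(64)(f_rgb, f_depth)  # -> (2, 64, 32, 32)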
IEEE Access | Computer Science, Information Systems | Engineering, Electrical & Electronic
CiteScore
9.80
Self-citation rate
7.70%
Articles published
6673
Review time
6 weeks
Journal introduction:
IEEE Access® is a multidisciplinary, open access (OA), applications-oriented, all-electronic archival journal that continuously presents the results of original research or development across all of IEEE's fields of interest.
IEEE Access will publish articles that are of high interest to readers, original, technically correct, and clearly presented. Supported by author publication charges (APC), its hallmarks are a rapid peer review and publication process with open access to all readers. Unlike IEEE's traditional Transactions or Journals, reviews are "binary", in that reviewers will either Accept or Reject an article in the form it is submitted in order to achieve rapid turnaround. Especially encouraged are submissions on:
Multidisciplinary topics, or applications-oriented articles and negative results that do not fit within the scope of IEEE's traditional journals.
Practical articles discussing new experiments or measurement techniques, interesting solutions to engineering problems.
Development of new or improved fabrication or manufacturing techniques.
Reviews or survey articles of new or evolving fields oriented to assist others in understanding the new area.