Automatic Water Body Extraction from SAR Images Based on MADF-Net
Jing Wang, Dongmei Jia, Jiaxing Xue, Zhongwu Wu, Wanying Song
Remote Sensing, 2024-09-14. DOI: 10.3390/rs16183419
Abstract
Water extraction from synthetic aperture radar (SAR) images has important application value in wetland monitoring, flood monitoring, and related tasks. However, it still faces problems of low generalization, weak extraction of detailed information, and weak suppression of background noise. Therefore, this paper proposes a new framework, the Multi-scale Attention Detailed Feature fusion Network (MADF-Net), which comprises an encoder and a decoder. In the encoder, ResNet101 is used as the backbone network to capture feature maps at four different depths, and the proposed Deep Pyramid Pool (DAPP) module then performs multi-scale pooling operations, ensuring that key water features are captured even in complex backgrounds. In the decoder, a Channel Spatial Attention Module (CSAM) is proposed that focuses on feature regions critical for identifying water edges by fusing attention weights across the channel and spatial dimensions. Finally, the high-level semantic information is fused with the low-level edge features to produce the final water detection results. In the experiments, Sentinel-1 SAR images of three scenes with water bodies of different characteristics and scales are used. The pixel accuracy (PA) and intersection over union (IoU) of water extraction by MADF-Net reach 92.77% and 89.03%, respectively, clearly outperforming several other networks. MADF-Net extracts water with high precision from SAR images with different backgrounds and could also be applied to other segmentation and classification tasks on SAR images.
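The abstract describes the architecture only at a high level, so the following is a minimal PyTorch sketch of such an encoder-decoder layout, not the authors' implementation: a PSPNet-style multi-scale pooling block stands in for DAPP, a CBAM-style channel/spatial attention block stands in for CSAM, and the class names (PyramidPooling, ChannelSpatialAttention, WaterSegNet), channel widths, and pooling bin sizes are all illustrative assumptions.

```python
# Hedged sketch of a ResNet101 encoder with multi-scale pooling and an
# attention-guided decoder that fuses deep semantics with shallow edge
# features. Module designs are assumptions, not the published MADF-Net.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet101


class PyramidPooling(nn.Module):
    """Multi-scale pooling over the deepest encoder features (DAPP stand-in)."""
    def __init__(self, in_ch, out_ch, bins=(1, 2, 3, 6)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(nn.AdaptiveAvgPool2d(b),
                          nn.Conv2d(in_ch, out_ch, 1, bias=False),
                          nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True))
            for b in bins])
        self.project = nn.Conv2d(in_ch + len(bins) * out_ch, out_ch, 1)

    def forward(self, x):
        h, w = x.shape[-2:]
        pooled = [F.interpolate(b(x), size=(h, w), mode="bilinear",
                                align_corners=False) for b in self.branches]
        return self.project(torch.cat([x] + pooled, dim=1))


class ChannelSpatialAttention(nn.Module):
    """Fuses channel and spatial attention weights (CSAM stand-in)."""
    def __init__(self, ch, reduction=16):
        super().__init__()
        self.channel_mlp = nn.Sequential(
            nn.Linear(ch, ch // reduction), nn.ReLU(inplace=True),
            nn.Linear(ch // reduction, ch))
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        b, c, _, _ = x.shape
        # Channel attention from globally pooled features.
        ca = torch.sigmoid(self.channel_mlp(x.mean(dim=(2, 3)))).view(b, c, 1, 1)
        x = x * ca
        # Spatial attention from channel-wise mean and max maps.
        sa = torch.sigmoid(self.spatial_conv(
            torch.cat([x.mean(dim=1, keepdim=True),
                       x.max(dim=1, keepdim=True).values], dim=1)))
        return x * sa


class WaterSegNet(nn.Module):
    """Encoder (ResNet101 + pyramid pooling) and attention-guided decoder."""
    def __init__(self, num_classes=2):
        super().__init__()
        backbone = resnet101(weights=None)
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1,
                                  backbone.relu, backbone.maxpool)
        self.layer1, self.layer2 = backbone.layer1, backbone.layer2
        self.layer3, self.layer4 = backbone.layer3, backbone.layer4
        self.pool = PyramidPooling(2048, 256)
        self.low_proj = nn.Conv2d(256, 256, 1)      # shallow edge features
        self.attn = ChannelSpatialAttention(256 + 256)
        self.head = nn.Conv2d(512, num_classes, 1)

    def forward(self, x):
        size = x.shape[-2:]
        x = self.stem(x)
        low = self.layer1(x)                          # low-level edge features
        deep = self.layer4(self.layer3(self.layer2(low)))
        deep = self.pool(deep)                        # multi-scale context
        deep = F.interpolate(deep, size=low.shape[-2:], mode="bilinear",
                             align_corners=False)
        fused = self.attn(torch.cat([deep, self.low_proj(low)], dim=1))
        out = self.head(fused)
        return F.interpolate(out, size=size, mode="bilinear",
                             align_corners=False)


# Example: a single-channel SAR chip repeated to 3 channels for ResNet input.
net = WaterSegNet()
sar = torch.randn(1, 1, 256, 256).repeat(1, 3, 1, 1)
print(net(sar).shape)  # torch.Size([1, 2, 256, 256])
```

The design choice mirrored here is the one stated in the abstract: multi-scale pooling enriches the deepest encoder features, attention re-weights the fused channels and spatial positions, and the attended features are combined with shallow features before upsampling to the input resolution.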
Journal Introduction:
Remote Sensing (ISSN 2072-4292) publishes regular research papers, reviews, letters and communications covering all aspects of the remote sensing process, from instrument design and signal processing to the retrieval of geophysical parameters and their application in geosciences. Our aim is to encourage scientists to publish experimental, theoretical and computational results in as much detail as possible so that results can be easily reproduced. There is no restriction on the length of the papers. The full experimental details must be provided so that the results can be reproduced.