Building Footprint Extraction from Remote Sensing Images with Residual Attention Multi-Scale Aggregation Fully Convolutional Network

IF 2.2 · Region 4 (Earth Science) · JCR Q3 (Environmental Sciences) · Journal of the Indian Society of Remote Sensing · Pub Date: 2024-07-31 · DOI: 10.1007/s12524-024-01961-8
Nima Ahmadian, Amin Sedaghat, Nazila Mohammadi
{"title":"利用残留注意力多尺度聚合全卷积网络从遥感图像中提取建筑物足迹","authors":"Nima Ahmadian, Amin Sedaghat, Nazila Mohammadi","doi":"10.1007/s12524-024-01961-8","DOIUrl":null,"url":null,"abstract":"<p>Building footprint extraction is crucial for various applications, including disaster management, change detection, and 3D modeling. Satellite and aerial images, when combined with deep learning techniques, offer an effective means for this task. The Multi-scale Aggregation Fully Convolutional Network (MA-FCN) is an encoder-decoder model that emphasizes scale information, producing the final segmentation map by concatenating four feature maps from different stages of the decoder. To enhance segmentation accuracy, we propose two novel deep learning models: Attention MA-FCN and Residual Attention MA-FCN. Attention MA-FCN incorporates attention gates in the skip connections to emphasize relevant features, directing the model’s focus to essential areas. Residual Attention MA-FCN further integrates residual blocks into the architecture, using both attention mechanisms and residual blocks to improve stability against gradient vanishing and overfitting, thereby enabling deeper training. These models were evaluated on the WHU, Massachusetts, and Jinghai District datasets, showing superior performance compared to the original MA-FCN. Specifically, Residual Attention MA-FCN outperformed MA-FCN and Attention MA-FCN by 3.6% and 0.92% on the WHU dataset, and by 5.51% and 0.91% on the Massachusetts dataset in terms of the Intersection Over Union (IOU) metric. Additionally, Residual Attention MA-FCN surpassed MA-FCN, Attention MA-FCN, Mask-RCNN, and U-Net models on the Jinghai District dataset. Due to the significance of building footprint extraction in various applications, the results of this study indicates that the proposed methods are more accurate than the MA-FCN model with better performances in IOU and F1-score metrics.</p>","PeriodicalId":17510,"journal":{"name":"Journal of the Indian Society of Remote Sensing","volume":"214 1","pages":""},"PeriodicalIF":2.2000,"publicationDate":"2024-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Building Footprint Extraction from Remote Sensing Images with Residual Attention Multi-Scale Aggregation Fully Convolutional Network\",\"authors\":\"Nima Ahmadian, Amin Sedaghat, Nazila Mohammadi\",\"doi\":\"10.1007/s12524-024-01961-8\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Building footprint extraction is crucial for various applications, including disaster management, change detection, and 3D modeling. Satellite and aerial images, when combined with deep learning techniques, offer an effective means for this task. The Multi-scale Aggregation Fully Convolutional Network (MA-FCN) is an encoder-decoder model that emphasizes scale information, producing the final segmentation map by concatenating four feature maps from different stages of the decoder. To enhance segmentation accuracy, we propose two novel deep learning models: Attention MA-FCN and Residual Attention MA-FCN. Attention MA-FCN incorporates attention gates in the skip connections to emphasize relevant features, directing the model’s focus to essential areas. Residual Attention MA-FCN further integrates residual blocks into the architecture, using both attention mechanisms and residual blocks to improve stability against gradient vanishing and overfitting, thereby enabling deeper training. 
These models were evaluated on the WHU, Massachusetts, and Jinghai District datasets, showing superior performance compared to the original MA-FCN. Specifically, Residual Attention MA-FCN outperformed MA-FCN and Attention MA-FCN by 3.6% and 0.92% on the WHU dataset, and by 5.51% and 0.91% on the Massachusetts dataset in terms of the Intersection Over Union (IOU) metric. Additionally, Residual Attention MA-FCN surpassed MA-FCN, Attention MA-FCN, Mask-RCNN, and U-Net models on the Jinghai District dataset. Due to the significance of building footprint extraction in various applications, the results of this study indicates that the proposed methods are more accurate than the MA-FCN model with better performances in IOU and F1-score metrics.</p>\",\"PeriodicalId\":17510,\"journal\":{\"name\":\"Journal of the Indian Society of Remote Sensing\",\"volume\":\"214 1\",\"pages\":\"\"},\"PeriodicalIF\":2.2000,\"publicationDate\":\"2024-07-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of the Indian Society of Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1007/s12524-024-01961-8\",\"RegionNum\":4,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ENVIRONMENTAL SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of the Indian Society of Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1007/s12524-024-01961-8","RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENVIRONMENTAL SCIENCES","Score":null,"Total":0}
Citations: 0

Abstract

Building footprint extraction is crucial for various applications, including disaster management, change detection, and 3D modeling. Satellite and aerial images, when combined with deep learning techniques, offer an effective means for this task. The Multi-scale Aggregation Fully Convolutional Network (MA-FCN) is an encoder-decoder model that emphasizes scale information, producing the final segmentation map by concatenating four feature maps from different stages of the decoder. To enhance segmentation accuracy, we propose two novel deep learning models: Attention MA-FCN and Residual Attention MA-FCN. Attention MA-FCN incorporates attention gates in the skip connections to emphasize relevant features, directing the model’s focus to essential areas. Residual Attention MA-FCN further integrates residual blocks into the architecture, using both attention mechanisms and residual blocks to improve stability against gradient vanishing and overfitting, thereby enabling deeper training.
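To make the described modifications concrete, the sketch below shows, in PyTorch, one plausible form of an attention-gated skip connection, a residual block, and the multi-scale aggregation step that concatenates upsampled decoder feature maps. All module names, channel sizes, and wiring are assumptions for illustration, not the authors' reference implementation.

```python
# Minimal PyTorch sketch of the components described in the abstract:
# an attention gate on a skip connection, a residual block, and the
# multi-scale aggregation of decoder feature maps. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with an identity shortcut to ease gradient flow."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)  # shortcut connection


class AttentionGate(nn.Module):
    """Additive attention gate applied to an encoder skip connection.

    A coarser decoder feature g (the gating signal) re-weights the encoder
    feature x before it is concatenated in the decoder.
    """

    def __init__(self, skip_channels, gating_channels, inter_channels):
        super().__init__()
        self.theta_x = nn.Conv2d(skip_channels, inter_channels, kernel_size=1)
        self.phi_g = nn.Conv2d(gating_channels, inter_channels, kernel_size=1)
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)

    def forward(self, x, g):
        # Bring the gating signal to the spatial size of the skip connection.
        g = F.interpolate(self.phi_g(g), size=x.shape[2:], mode="bilinear",
                          align_corners=False)
        attn = torch.sigmoid(self.psi(F.relu(self.theta_x(x) + g)))
        return x * attn  # emphasize relevant spatial locations


def aggregate_multiscale(decoder_maps, out_size):
    """Upsample feature maps from different decoder stages to a common
    resolution and concatenate them along the channel axis."""
    upsampled = [F.interpolate(m, size=out_size, mode="bilinear",
                               align_corners=False) for m in decoder_maps]
    return torch.cat(upsampled, dim=1)


if __name__ == "__main__":
    skip = torch.randn(1, 64, 128, 128)   # encoder feature passed by a skip
    gate = torch.randn(1, 128, 64, 64)    # coarser decoder feature
    attended = AttentionGate(64, 128, 32)(skip, gate)
    print(attended.shape)                 # torch.Size([1, 64, 128, 128])
```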
These models were evaluated on the WHU, Massachusetts, and Jinghai District datasets, showing superior performance compared to the original MA-FCN. Specifically, Residual Attention MA-FCN outperformed MA-FCN and Attention MA-FCN by 3.6% and 0.92% on the WHU dataset, and by 5.51% and 0.91% on the Massachusetts dataset, in terms of the Intersection over Union (IoU) metric. Additionally, Residual Attention MA-FCN surpassed the MA-FCN, Attention MA-FCN, Mask-RCNN, and U-Net models on the Jinghai District dataset. Given the significance of building footprint extraction in various applications, these results indicate that the proposed methods are more accurate than the MA-FCN model, with better performance on the IoU and F1-score metrics.
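For reference, the sketch below shows one way the reported IoU and F1-score could be computed from binary building masks; it is a minimal NumPy illustration with made-up inputs, not the paper's evaluation pipeline.

```python
# Minimal NumPy sketch of the metrics mentioned above: IoU and F1-score
# computed from binary masks. Variable names and toy arrays are illustrative.
import numpy as np


def iou_and_f1(pred, target, eps=1e-7):
    """Compute Intersection over Union and F1-score for binary masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    tp = np.logical_and(pred, target).sum()
    fp = np.logical_and(pred, ~target).sum()
    fn = np.logical_and(~pred, target).sum()
    iou = tp / (tp + fp + fn + eps)
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    f1 = 2 * precision * recall / (precision + recall + eps)
    return iou, f1


# Toy example: 2x3 prediction and ground-truth masks.
pred = np.array([[1, 1, 0], [0, 1, 0]])
target = np.array([[1, 0, 0], [0, 1, 1]])
print(iou_and_f1(pred, target))  # IoU = 0.5, F1 ≈ 0.667
```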

Source journal
Journal of the Indian Society of Remote Sensing
Categories: Environmental Sciences; Remote Sensing
CiteScore: 4.80
Self-citation rate: 8.00%
Articles published: 163
Review time: 7 months
About the journal: The aims and scope of the Journal of the Indian Society of Remote Sensing are to help towards advancement, dissemination and application of the knowledge of Remote Sensing technology, which is deemed to include photo interpretation, photogrammetry, aerial photography, image processing, and other related technologies in the field of survey, planning and management of natural resources and other areas of application where the technology is considered to be appropriate, to promote interaction among all persons, bodies, institutions (private and/or state-owned) and industries interested in achieving advancement, dissemination and application of the technology, to encourage and undertake research in remote sensing and related technologies and to undertake and execute all acts which shall promote all or any of the aims and objectives of the Indian Society of Remote Sensing.
Latest articles in this journal
A Heuristic Approach of Radiometric Calibration for Ocean Colour Sensors: A Case Study Using ISRO’s Ocean Colour Monitor-2
Farmland Extraction from UAV Remote Sensing Images Based on Improved SegFormer Model
Self Organizing Map based Land Cover Clustering for Decision-Level Jaccard Index and Block Activity based Pan-Sharpened Images
Improved Building Extraction from Remotely Sensed Images by Integration of Encode–Decoder and Edge Enhancement Models
Enhancing Change Detection Accuracy in Remote Sensing Images Through Feature Optimization and Game Theory Classifier