Dual-branch multi-modal convergence network for crater detection using Chang’e image

Feng Lin, Xie Hu, Yiling Lin, Yao Li, Yang Liu, Dongmei Li
DOI: 10.1016/j.jag.2024.104215
Journal: International Journal of Applied Earth Observation and Geoinformation (ITC Journal), Vol. 134, Article 104215
Published: 2024-10-17 (Journal Article)
Impact Factor: 7.6 · JCR Q1 (Remote Sensing)
Citations: 0

Abstract

Knowledge about impact craters on rocky planets is crucial for understanding the evolutionary history of the universe. Compared to traditional visual interpretation, deep learning approaches have improved the efficiency of crater detection. However, single-source data and divergent data quality limit detection accuracy. In this study, we focus on valuable features in multi-modal remote sensing data from the Chang'e lunar exploration mission and propose an Attention-based Dual-branch Segmentation Network (ADSNet). First, ADSNet extracts multi-modal features via a dual-branch encoder. Second, we introduce a novel attention mechanism for data fusion, in which features from the auxiliary modality are weighted by a scoring function and then fused with those from the primary modality. After fusion, the features are passed to the decoder through skip connections. Lastly, high-accuracy crater detection is achieved through semantic segmentation based on the learned multi-modal features. Our results demonstrate that ADSNet outperforms baseline models on metrics such as IoU and F1 score. ADSNet is an effective approach for leveraging multi-modal remote sensing data to detect geomorphological features on rocky planets in general.
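The abstract names the fusion step (a scoring function weights auxiliary-modality features, which are then fused with primary-modality features) and the evaluation metrics (IoU, F1) but not their exact operators. Below is a minimal NumPy sketch under stated assumptions: a sigmoid scoring function, gated additive fusion, and toy feature-map shapes. The names `attention_fusion` and `iou_f1`, the gating form, and the modality roles in the comments are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_fusion(primary, auxiliary, w=1.0, b=0.0):
    # Score each auxiliary-modality feature with a sigmoid scoring
    # function, then add the gated auxiliary features to the primary
    # features (gated additive fusion; one plausible reading of the
    # fusion described in the abstract).
    scores = sigmoid(w * auxiliary + b)   # attention scores in (0, 1)
    return primary + scores * auxiliary

def iou_f1(pred, gt):
    # Binary-mask IoU and F1 score for evaluating a crater segmentation.
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    fp = np.logical_and(pred, ~gt).sum()
    fn = np.logical_and(~pred, gt).sum()
    iou = tp / (tp + fp + fn)
    f1 = 2 * tp / (2 * tp + fp + fn)
    return iou, f1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    primary = rng.standard_normal((4, 8, 8))    # e.g. optical-image features
    auxiliary = rng.standard_normal((4, 8, 8))  # e.g. elevation-data features
    fused = attention_fusion(primary, auxiliary)
    print(fused.shape)  # (4, 8, 8)

    pred = np.array([[1, 1], [0, 0]])
    gt = np.array([[1, 0], [0, 0]])
    print(iou_f1(pred, gt))  # tp=1, fp=1, fn=0 -> IoU 0.5, F1 ~ 0.667
```

In an actual network the scoring function would be a learned layer (e.g. a convolution followed by a sigmoid) rather than a fixed affine map, but the gating pattern is the same.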
Source Journal

International Journal of Applied Earth Observation and Geoinformation (ITC Journal)
Subject areas: Global and Planetary Change; Management, Monitoring, Policy and Law; Earth-Surface Processes; Computers in Earth Sciences
CiteScore: 12.00
Self-citation rate: 0.00%
Review time: 77 days
Journal Description: The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes such as data capture, databasing, visualization, interpretation, data quality, and spatial uncertainty.