SAR Target Recognition Based on Efficient Fully Convolutional Attention Block CNN

IF 4.0 · JCR Q2 (Engineering, Electrical & Electronic) · SCI Region 3 (Earth Science) · IEEE Geoscience and Remote Sensing Letters, vol. 19, pp. 1-5 · Pub Date: 2022-01-01 · DOI: 10.1109/LGRS.2020.3037256
Rui Li, Xiaodan Wang, Jian Wang, Yafei Song, Lei Lei
Citations: 15

Abstract

Attention mechanisms have recently shown strong potential in improving the performance of convolutional neural networks (CNNs). This letter proposes a fully convolutional attention block (FCAB) that can be combined with a CNN to refine important features and suppress unnecessary ones in synthetic aperture radar (SAR) images. The FCAB consists of a channel attention module and a spatial attention module. For the channel attention module, we use average-pooling and max-pooling to learn complementary features, and apply group convolution to aggregate the information of the two types of channels. Global average-pooling is then used to encode the channel-wise importance. For the spatial attention module, the average-pooling and max-pooling along the channel axis are used to generate two spatial feature maps, and then two very lightweight convolutional layers are used to encode the spatial weight map. Experimental results on SAR images demonstrate that our FCAB can focus on important channels and object regions. It uses relatively few parameters and is computationally efficient, while bringing about significant performance gain for SAR recognition.
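The two attention modules described above can be sketched in PyTorch as follows. This is an interpretation of the abstract only: the exact kernel sizes, pooling strides, and hidden widths are assumptions, not the authors' published configuration, so treat it as a rough sketch of the FCAB idea rather than the paper's implementation.

```python
import torch
import torch.nn as nn

class FCAB(nn.Module):
    """Rough sketch of a fully convolutional attention block (FCAB).
    Kernel sizes and hidden widths are illustrative assumptions."""

    def __init__(self, channels, spatial_kernel=7):
        super().__init__()
        # Channel attention: the avg- and max-pooled maps are interleaved
        # and aggregated with a group convolution (one group per channel),
        # then global average pooling encodes channel-wise importance.
        self.group_conv = nn.Conv2d(2 * channels, channels, kernel_size=3,
                                    padding=1, groups=channels)
        self.gap = nn.AdaptiveAvgPool2d(1)
        # Spatial attention: two lightweight conv layers encode a weight map
        # from the channel-axis avg- and max-pooled feature maps.
        self.spatial = nn.Sequential(
            nn.Conv2d(2, 8, spatial_kernel, padding=spatial_kernel // 2),
            nn.ReLU(inplace=True),
            nn.Conv2d(8, 1, spatial_kernel, padding=spatial_kernel // 2),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        # --- channel attention ---
        avg = nn.functional.avg_pool2d(x, 2)   # complementary pooled features
        mx = nn.functional.max_pool2d(x, 2)
        # interleave so each conv group sees one avg and one max channel
        stacked = torch.stack([avg, mx], dim=2).reshape(b, 2 * c, h // 2, w // 2)
        ch_weight = torch.sigmoid(self.gap(self.group_conv(stacked)))
        x = x * ch_weight                      # reweight channels
        # --- spatial attention ---
        sp = torch.cat([x.mean(dim=1, keepdim=True),
                        x.max(dim=1, keepdim=True).values], dim=1)
        return x * torch.sigmoid(self.spatial(sp))  # reweight locations
```

The block is input-size preserving, so it can be dropped between any two convolutional stages of a backbone CNN; the group convolution keeps the parameter count close to that of a depthwise layer, which matches the abstract's efficiency claim.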
Source journal: IEEE Geoscience and Remote Sensing Letters (Engineering & Technology - Geochemistry & Geophysics)
CiteScore: 7.60
Self-citation rate: 12.50%
Articles published per year: 1113
Review time: 3.4 months
Journal description: IEEE Geoscience and Remote Sensing Letters (GRSL) is a monthly publication for short papers (maximum length 5 pages) addressing new ideas and formative concepts in remote sensing, as well as important new and timely results and concepts. Papers should relate to the theory, concepts, and techniques of science and engineering as applied to sensing the earth, oceans, atmosphere, and space, and the processing, interpretation, and dissemination of this information. The technical content of papers must be both new and significant. Experimental data must be complete and include sufficient description of experimental apparatus, methods, and relevant experimental conditions. GRSL encourages the incorporation of "extended objects" or "multimedia" such as animations to enhance the shorter papers.