PS-GAN: Pseudo Supervised Generative Adversarial Network With Single Scale Retinex Embedding for Infrared and Visible Image Fusion

IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (IF 5.3; JCR Q1, Engineering, Electrical & Electronic; CAS Region 2, Earth Science). Pub Date: 2024-12-09. DOI: 10.1109/JSTARS.2024.3509684
Jin Qi;Deboch Eyob Abera;Jian Cheng
Volume 18, pp. 1766–1777. Full text: https://ieeexplore.ieee.org/document/10783431/ (PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10783431)

Abstract

Ground-truth fusion images, fused-image contrast, and naturalness are rarely considered in existing infrared and visible image fusion (IVF) methods. In this article, we propose a pseudo-supervised generative adversarial network (GAN) with single-scale retinex (SSR) embedding for IVF. First, the concept of a pseudo-ground-truth fusion image, together with a method for computing it, is proposed to address the shortage of ground-truth fusion images. Second, a novel residual GAN with an embedded SSR module is designed to improve the contrast and naturalness of the fused image. Finally, a dense, mixed-modal input strategy is proposed for better mixed-modal feature extraction. Extensive experiments on public IVF datasets verify the superior performance of the proposed approach over other representative methods, demonstrating that fused-image details, contrast, and naturalness are significantly improved.
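The abstract names single-scale retinex (SSR) as the embedded module; the paper's own module design is described only in the full text, but classical SSR is simply the log-ratio of an image to a Gaussian-blurred estimate of its illumination. A minimal sketch of that standard formula follows (this is the textbook SSR operation, not the paper's network module; the `sigma` value and toy image are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(img, sigma=80.0, eps=1e-6):
    """Classical SSR: R(x, y) = log I(x, y) - log(G_sigma * I)(x, y),
    where G_sigma is a Gaussian surround of scale sigma."""
    img = img.astype(np.float64) + eps          # avoid log(0)
    illumination = gaussian_filter(img, sigma=sigma)
    return np.log(img) - np.log(illumination + eps)

# Toy example: a flat 50-intensity image with one bright 200-intensity patch.
img = np.full((64, 64), 50.0)
img[20:40, 20:40] = 200.0
r = single_scale_retinex(img, sigma=10.0)
```

The blur removes slowly varying illumination, so the retinex response is large where the input is locally brighter than its surround (the patch center) and slightly negative in the flat background, which is the contrast-enhancing behavior the paper exploits.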
Journal metrics: CiteScore 9.30 · Self-citation rate 10.90% · Articles per year 563 · Review time 4.7 months
Journal description: The IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing addresses the growing field of applications in Earth observations and remote sensing, and also provides a venue for the rapidly expanding special issues sponsored by the IEEE Geoscience and Remote Sensing Society. The journal draws upon the experience of the highly successful IEEE Transactions on Geoscience and Remote Sensing and provides a complementary medium for the wide range of topics in applied Earth observations. The "Applications" areas encompass the societal-benefit areas of the Global Earth Observation System of Systems (GEOSS) program. Through deliberations over two years, ministers from 50 countries agreed to identify nine areas where Earth observation could positively impact the quality of life and health of their respective countries. Some of these are areas not traditionally addressed in the IEEE context, including biodiversity, health, and climate. Yet it is the skill sets of IEEE members, in areas such as observations, communications, computers, signal processing, standards, and ocean engineering, that form the technical underpinnings of GEOSS. Thus, the journal attracts a broad range of interests, serving present members in new ways and expanding IEEE visibility into new areas.