Physical prior-guided deep fusion network with shading cues for shape from polarization

IF 14.7 · CAS Tier 1, Computer Science · JCR Q1 (Computer Science, Artificial Intelligence) · Information Fusion · Pub Date: 2024-11-23 · DOI: 10.1016/j.inffus.2024.102805
Rui Liu, Zhiyuan Zhang, Yini Peng, Jiayi Ma, Xin Tian
{"title":"Physical prior-guided deep fusion network with shading cues for shape from polarization","authors":"Rui Liu ,&nbsp;Zhiyuan Zhang ,&nbsp;Yini Peng ,&nbsp;Jiayi Ma ,&nbsp;Xin Tian","doi":"10.1016/j.inffus.2024.102805","DOIUrl":null,"url":null,"abstract":"<div><div>Shape from polarization (SfP) is a powerful passive three-dimensional imaging technique that enables the reconstruction of surface normal with dense textural details. However, existing deep learning-based SfP methods only focus on the polarization prior, which makes it difficult to accurately reconstruct targets with rich texture details under complicated scenes. Aiming to improve the reconstruction accuracy, we utilize the surface normal estimated from shading cues and the innovatively proposed specular confidence as shading prior to provide additional feature information. Furthermore, to efficiently combine the polarization and shading priors, a novel deep fusion network named SfPSNet is proposed for the information extraction and the reconstruction of surface normal. SfPSNet is implemented based on a dual-branch architecture to handle different physical priors. A feature correction module is specifically designed to mutually rectify the defects in channel-wise and spatial-wise dimensions, respectively. In addition, a feature fusion module is proposed to fuse the feature maps of polarization and shading priors based on an efficient cross-attention mechanism. Our experimental results show that the fusion of polarization and shading priors can significantly improve the reconstruction quality of surface normal, especially for objects or scenes illuminated by complex lighting sources. As a result, SfPSNet shows state-of-the-art performance compared with existing deep learning-based SfP methods benefiting from its efficiency in extracting and fusing information from different priors.</div></div>","PeriodicalId":50367,"journal":{"name":"Information Fusion","volume":"117 ","pages":"Article 102805"},"PeriodicalIF":14.7000,"publicationDate":"2024-11-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Fusion","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1566253524005839","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Shape from polarization (SfP) is a powerful passive three-dimensional imaging technique that enables the reconstruction of surface normals with dense textural details. However, existing deep learning-based SfP methods focus only on the polarization prior, which makes it difficult to accurately reconstruct targets with rich texture details in complicated scenes. To improve reconstruction accuracy, we use the surface normal estimated from shading cues, together with the newly proposed specular confidence, as a shading prior that provides additional feature information. Furthermore, to combine the polarization and shading priors efficiently, a novel deep fusion network named SfPSNet is proposed for information extraction and surface-normal reconstruction. SfPSNet is built on a dual-branch architecture to handle the different physical priors. A feature correction module is specifically designed so that the two branches mutually rectify each other's defects in the channel-wise and spatial-wise dimensions. In addition, a feature fusion module is proposed to fuse the feature maps of the polarization and shading priors via an efficient cross-attention mechanism. Our experimental results show that fusing the polarization and shading priors significantly improves the reconstruction quality of surface normals, especially for objects or scenes illuminated by complex lighting sources. As a result, SfPSNet achieves state-of-the-art performance compared with existing deep learning-based SfP methods, benefiting from its efficiency in extracting and fusing information from the different priors.
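
For readers who prefer code, the sketch below illustrates the dual-branch idea described in the abstract: one branch encodes the polarization input, the other encodes the shading prior (estimated normal plus specular confidence), a feature-correction step lets each branch rectify the other in the channel-wise and spatial-wise dimensions, and a cross-attention fusion step combines the two before predicting per-pixel surface normals. This is a minimal PyTorch sketch under assumed layer sizes and module designs; `FeatureCorrection`, `CrossAttentionFusion`, `DualBranchSfP`, and all hyperparameters are illustrative placeholders, not the authors' SfPSNet implementation.

```python
# Minimal sketch (not the authors' code) of a dual-branch fusion network in the
# spirit of SfPSNet: polarization branch + shading branch, mutual feature
# correction, cross-attention fusion, and a surface-normal prediction head.
# All layer sizes and module designs are illustrative assumptions.
import torch
import torch.nn as nn


class FeatureCorrection(nn.Module):
    """Rectify a feature map with channel-wise and spatial-wise gates
    computed from the other branch (hypothetical formulation)."""

    def __init__(self, channels: int):
        super().__init__()
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x, other):
        x = x * self.channel_gate(other)   # channel-wise correction
        x = x * self.spatial_gate(other)   # spatial-wise correction
        return x


class CrossAttentionFusion(nn.Module):
    """Fuse two feature maps with multi-head cross-attention (illustrative)."""

    def __init__(self, channels: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        self.proj = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, pol, shd):
        b, c, h, w = pol.shape
        q = pol.flatten(2).transpose(1, 2)    # polarization features as queries
        kv = shd.flatten(2).transpose(1, 2)   # shading features as keys/values
        fused, _ = self.attn(q, kv, kv)
        fused = fused.transpose(1, 2).reshape(b, c, h, w)
        return self.proj(torch.cat([pol, fused], dim=1))


class DualBranchSfP(nn.Module):
    """Toy dual-branch model: polarization images + shading prior -> normals."""

    def __init__(self, pol_in: int = 4, shd_in: int = 4, channels: int = 32):
        super().__init__()
        self.pol_enc = nn.Sequential(nn.Conv2d(pol_in, channels, 3, padding=1), nn.ReLU())
        self.shd_enc = nn.Sequential(nn.Conv2d(shd_in, channels, 3, padding=1), nn.ReLU())
        self.correct_pol = FeatureCorrection(channels)
        self.correct_shd = FeatureCorrection(channels)
        self.fuse = CrossAttentionFusion(channels)
        self.head = nn.Conv2d(channels, 3, kernel_size=1)  # per-pixel surface normal

    def forward(self, pol, shd):
        p, s = self.pol_enc(pol), self.shd_enc(shd)
        p, s = self.correct_pol(p, s), self.correct_shd(s, p)  # mutual rectification
        n = self.head(self.fuse(p, s))
        return nn.functional.normalize(n, dim=1)  # unit-length normals


if __name__ == "__main__":
    # Assumed inputs: 4 polarization channels (e.g. 0/45/90/135 degrees) and a
    # 4-channel shading prior (3-channel estimated normal + specular confidence).
    model = DualBranchSfP()
    normals = model(torch.rand(1, 4, 64, 64), torch.rand(1, 4, 64, 64))
    print(normals.shape)  # torch.Size([1, 3, 64, 64])
```

Treating the polarization features as queries and the shading features as keys/values is one plausible reading of the cross-attention fusion described in the abstract; consult the paper for the exact formulation.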
Source journal: Information Fusion (Engineering & Technology – Computer Science: Theory & Methods)
CiteScore: 33.20
Self-citation rate: 4.30%
Articles published: 161
Review time: 7.9 months
Journal description: Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines driving its progress. It is the leading outlet for sharing research and development in this field, focusing on architectures, algorithms, and applications. Papers dealing with fundamental theoretical analyses, as well as those demonstrating their application to real-world problems, are welcome.
Latest articles in this journal:
Editorial Board
FedKD-IDS: A robust intrusion detection system using knowledge distillation-based semi-supervised federated learning and anti-poisoning attack mechanism
Physical prior-guided deep fusion network with shading cues for shape from polarization
Incomplete multi-view clustering based on hypergraph
Self-supervised learning-based multi-source spectral fusion for fruit quality evaluation: a case study in mango fruit ripeness prediction