Unsupervised light field disparity estimation using confidence weight and occlusion-aware

IF 3.7 · JCR Q2 (Optics) · CAS Tier 2 (Engineering & Technology) · Optics and Lasers in Engineering · Pub Date: 2025-06-01 · Epub Date: 2025-03-05 · DOI: 10.1016/j.optlaseng.2025.108928
Bo Xiao , Xiujing Gao , Huadong Zheng , Huibao Yang , Hongwu Huang
Citations: 0

Abstract

Light field disparity estimation is a crucial topic in computer vision. Currently, deep learning methods have shown significantly improved performance compared to traditional methods, especially supervised learning approaches. However, the high cost of obtaining real-world depth/disparity data for training greatly limits the generalization ability of supervised learning methods. In this paper, we propose an unsupervised learning method for light field depth estimation by utilizing confidence weights to evaluate the reliability of disparity features. First, during the disparity estimation and inference process, we introduce confidence weights to assess the reliability of disparity features, assigning higher weights to non-occluded and low-noise areas to effectively handle errors caused by occlusion and noise. Second, we design an occlusion-aware network to predict occluded regions in the views, which addresses the interference of occluded regions when computing unsupervised loss during training, thus enhancing the overall estimation accuracy. Extensive experimental results show that our method outperforms traditional methods and some of the latest unsupervised learning methods.
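The two ideas named in the abstract — confidence weights that down-rank unreliable disparity evidence, and an occlusion mask that removes occluded pixels from the unsupervised loss — can be sketched together as one weighted photometric loss. This is a minimal illustration under assumed interfaces, not the paper's implementation: the function name, the array shapes, and the absolute-difference residual are all assumptions for the sketch.

```python
import numpy as np

def weighted_photometric_loss(center, warped, confidence, occlusion_mask, eps=1e-8):
    """Confidence-weighted, occlusion-masked unsupervised photometric loss.

    center:         (H, W) central-view intensities
    warped:         (V, H, W) side views warped to the center via the current disparity
    confidence:     (V, H, W) per-pixel reliability weights in [0, 1]
    occlusion_mask: (V, H, W) 1 where the pixel is visible, 0 where occluded
    """
    residual = np.abs(warped - center[None])   # photometric error per side view
    weight = confidence * occlusion_mask       # zero out occluded, down-weight noisy pixels
    # Normalize by the total weight so masked pixels do not dilute the loss
    return float((weight * residual).sum() / (weight.sum() + eps))
```

In this sketch, a pixel that warps badly because it is occluded contributes nothing once its mask entry is zero, which is the role the abstract assigns to the occlusion-aware network during training.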
Journal

Optics and Lasers in Engineering (Engineering & Technology — Optics)
CiteScore: 8.90
Self-citation rate: 8.70%
Annual articles: 384
Review time: 42 days
Journal description: Optics and Lasers in Engineering aims at providing an international forum for the interchange of information on the development of optical techniques and laser technology in engineering. Emphasis is placed on contributions targeted at the practical use of methods and devices, the development and enhancement of solutions and new theoretical concepts for experimental methods. Optics and Lasers in Engineering reflects the main areas in which optical methods are being used and developed for an engineering environment. Manuscripts should offer clear evidence of novelty and significance. Papers focusing on parameter optimization or computational issues are not suitable. Similarly, papers focused on an application rather than the optical method fall outside the journal's scope. The scope of the journal is defined to include the following:
- Optical Metrology
- Optical Methods for 3D visualization and virtual engineering
- Optical Techniques for Microsystems
- Imaging, Microscopy and Adaptive Optics
- Computational Imaging
- Laser methods in manufacturing
- Integrated optical and photonic sensors
- Optics and Photonics in Life Science
- Hyperspectral and spectroscopic methods
- Infrared and Terahertz techniques