Refinement of disparity estimates through the fusion of monocular image segmentations

D. McKeown, F. Perlant
{"title":"Refinement of disparity estimates through the fusion of monocular image segmentations","authors":"D. McKeown, F. Perlant","doi":"10.1109/CVPR.1992.223146","DOIUrl":null,"url":null,"abstract":"The authors examine how estimates of three-dimensional scene structure, as encoded in a scene disparity map, can be improved by the analysis of the original monocular imagery. They describe the utilization of surface illumination information provided by the segmentation of the monocular image into fine surface patches of nearly homogeneous intensity to remove mismatches generated during stereo matching. These patches are used to guide a statistical analysis of the disparity map based on the assumption that such patches correspond closely with physical surfaces in the scene. Such a technique is quite independent of whether the initial disparity map was generated by automated area-based or feature-based stereo matching. Refinement results on complex urban scenes containing various man-made and natural features are presented, and the improvements due to monocular fusion with a set of different region-based image segmentations are demonstrated.<<ETX>>","PeriodicalId":325476,"journal":{"name":"Proceedings 1992 IEEE Computer Society Conference on Computer Vision and Pattern Recognition","volume":"452 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1992-06-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings 1992 IEEE Computer Society Conference on Computer Vision and Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CVPR.1992.223146","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 9

Abstract

The authors examine how estimates of three-dimensional scene structure, as encoded in a scene disparity map, can be improved by the analysis of the original monocular imagery. They describe the utilization of surface illumination information provided by the segmentation of the monocular image into fine surface patches of nearly homogeneous intensity to remove mismatches generated during stereo matching. These patches are used to guide a statistical analysis of the disparity map based on the assumption that such patches correspond closely with physical surfaces in the scene. Such a technique is quite independent of whether the initial disparity map was generated by automated area-based or feature-based stereo matching. Refinement results on complex urban scenes containing various man-made and natural features are presented, and the improvements due to monocular fusion with a set of different region-based image segmentations are demonstrated.
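The refinement procedure the abstract outlines, guiding a statistical analysis of the disparity map with intensity-homogeneous segmentation patches, can be illustrated with a short sketch. The following Python/NumPy code is a minimal illustration under the assumption that each patch corresponds to a surface of nearly constant disparity; the function name, the MAD-based outlier rule, and all thresholds are assumptions of this sketch, not the authors' exact method.

import numpy as np

def refine_disparity(disparity, labels, k=3.0, min_pixels=20):
    """Flag and replace disparity mismatches within each segmented region.

    `labels` is an integer label image from a region-based segmentation of
    the monocular image. Each region is assumed to cover a physical surface
    with nearly constant disparity; pixels whose disparity deviates strongly
    from the region's robust estimate are treated as stereo mismatches.
    (Illustrative sketch only, not the paper's exact procedure.)
    """
    refined = disparity.astype(float).copy()
    for region in np.unique(labels):
        mask = labels == region
        if mask.sum() < min_pixels:        # too few pixels for stable statistics
            continue
        d = refined[mask]
        med = np.median(d)                 # robust estimate of the patch disparity
        mad = np.median(np.abs(d - med))   # robust spread (median absolute deviation)
        if mad == 0:
            continue
        # ~k-sigma test under normality (1.4826 scales MAD to a std. deviation)
        outliers = np.abs(d - med) > k * 1.4826 * mad
        d[outliers] = med                  # replace likely mismatches with the patch estimate
        refined[mask] = d
    return refined

# Example: a 100x100 disparity map with an injected mismatch in one region
labels = np.zeros((100, 100), dtype=int)
labels[:, 50:] = 1
disparity = np.where(labels == 0, 10.0, 25.0)
disparity[5, 5] = 80.0                     # a stereo matching error
refined = refine_disparity(disparity, labels)
print(refined[5, 5])                       # -> 10.0, pulled back to the surface estimate

Because the analysis operates only on the disparity values inside each region, the same routine applies whether the input disparity map came from area-based or feature-based matching, which is the independence property the abstract claims.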