Constant Time Stereo Matching

Myung-Ho Ju, Hang-Bong Kang
{"title":"Constant Time Stereo Matching","authors":"Myung-Ho Ju, Hang-Bong Kang","doi":"10.1109/IMVIP.2009.10","DOIUrl":null,"url":null,"abstract":"Typically, local methods for stereo matching are fast but have relatively low degree of accuracy while global ones, though costly, achieve a higher degree of accuracy in retrieving disparity information. Recently, however, some local methods such as those based on segmentation or adaptive weights are suggested to possibly achieve more accuracy than global ones in retrieving disparity information. The problem for these newly suggested local methods is that they cannot be easily adopted since they may require more computational costs which increase in proportion to the window size they use. To reduce the computational costs, therefore, we propose in this paper the stereo matching method that use domain weight and range weight similar to those in the bilateral filter. Our proposed method shows constant time O(1) for the stereo matching. Our experiments spend constant time for computation regardless of the window size but our experimental results show that the accuracy of generated depth map is as good as the ones suggested by recent methods.","PeriodicalId":179564,"journal":{"name":"2009 13th International Machine Vision and Image Processing Conference","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2009-09-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"17","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 13th International Machine Vision and Image Processing Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IMVIP.2009.10","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 17

Abstract

Typically, local methods for stereo matching are fast but relatively inaccurate, while global methods, though costly, retrieve disparity information with a higher degree of accuracy. Recently, however, some local methods, such as those based on segmentation or adaptive weights, have been suggested that can achieve higher accuracy than global ones in retrieving disparity information. The problem with these newly suggested local methods is that they cannot be easily adopted, since their computational cost grows in proportion to the window size they use. To reduce the computational cost, we therefore propose a stereo matching method that uses a domain weight and a range weight similar to those in the bilateral filter. Our proposed method performs stereo matching in constant time, O(1). Our experiments confirm that the computation time is constant regardless of the window size, while the accuracy of the generated depth maps is as good as that of recent methods.
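The abstract describes weighting each pixel in the support window by a domain (spatial) term and a range (intensity) term, as in the bilateral filter, while keeping the aggregation cost independent of the window size. The sketch below is not the authors' algorithm; it illustrates one common way such O(1)-per-pixel behaviour can be obtained, namely quantising the range term into a small number of intensity bins and replacing the explicit window loop with integral-image box sums. The spatial weight is simplified to a plain box, and all names and parameters (box_sum, aggregate_costs, sigma_r, n_bins) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def box_sum(img, radius):
    """Sum of `img` over a clipped (2*radius+1)^2 window around each pixel,
    computed with an integral image so the per-pixel cost does not grow
    with the radius."""
    h, w = img.shape
    ii = np.zeros((h + 1, w + 1))
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    y = np.arange(h)
    x = np.arange(w)
    y0, y1 = np.clip(y - radius, 0, h), np.clip(y + radius + 1, 0, h)
    x0, x1 = np.clip(x - radius, 0, w), np.clip(x + radius + 1, 0, w)
    return ii[y1][:, x1] - ii[y0][:, x1] - ii[y1][:, x0] + ii[y0][:, x0]

def aggregate_costs(cost_volume, guide, radius, sigma_r, n_bins=8):
    """Range-weighted (bilateral-style) aggregation of a matching-cost volume,
    guided by the grey-level image `guide`.  The intensity axis is quantised
    into n_bins levels; each level needs only two box sums, so the work per
    pixel depends on n_bins, not on the window radius."""
    d_max, h, w = cost_volume.shape
    bins = np.linspace(float(guide.min()), float(guide.max()), n_bins)
    step = bins[1] - bins[0]
    # fractional position of each pixel's intensity between its two nearest bins
    t = (guide - bins[0]) / step
    k = np.clip(np.floor(t).astype(int), 0, n_bins - 2)
    frac = np.clip(t - k, 0.0, 1.0)
    rows, cols = np.indices((h, w))
    out = np.empty((d_max, h, w))
    for d in range(d_max):
        levels = np.empty((n_bins, h, w))
        for i, b in enumerate(bins):
            # range weight of every pixel relative to the bin centre b
            w_range = np.exp(-((guide - b) ** 2) / (2.0 * sigma_r ** 2))
            levels[i] = (box_sum(w_range * cost_volume[d], radius)
                         / np.maximum(box_sum(w_range, radius), 1e-12))
        # linearly interpolate between the two bins bracketing each pixel's intensity
        out[d] = (1.0 - frac) * levels[k, rows, cols] + frac * levels[k + 1, rows, cols]
    return out

# Example: absolute-difference costs over a small disparity range, then
# winner-take-all on the aggregated volume.
left = np.random.rand(60, 80)
right = np.roll(left, 3, axis=1)          # toy right image, shifted by 3 pixels
costs = np.stack([np.abs(left - np.roll(right, -d, axis=1)) for d in range(8)])
aggregated = aggregate_costs(costs, left, radius=17, sigma_r=0.1)
disparity = aggregated.argmin(axis=0)     # roughly 3 away from the wrap-around seam
```

A Gaussian or other spatial kernel could be substituted for the box sum (for example by repeated box filtering) without losing the constant-time property; the point of the sketch is that the number of filter passes is fixed by n_bins rather than by the window radius, which is what the abstract's claim of constant computation time regardless of window size requires.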