A fast dense stereo matching algorithm with an application to 3D occupancy mapping using quadrocopters

Radouane Ait Jellal, A. Zell
DOI: 10.1109/ICAR.2015.7251515
Published in: 2015 International Conference on Advanced Robotics (ICAR), 2015-07-27
Citations: 4

Abstract

We propose a fast algorithm for computing stereo correspondences and correcting the mismatches. The correspondences are computed using stereo block matching and refined with a depth-aware method. We compute 16 disparities at the same time using SSE instructions. We evaluated our method on the Middlebury benchmark and obtained promising results for practical real-time applications. The use of SSE instructions allows us to reduce the time needed to process the Tsukuba stereo pair to 8 milliseconds (125 fps) on a Core i5 CPU with 2×3.3 GHz. Our disparity refinement method corrected 40% of the wrong matches at an additional computational cost of 5.2% (0.41 ms). The algorithm has been used to build 3D occupancy grid maps from stereo images. We used the datasets provided by the EuRoC Robotics Challenge. The reconstruction was accurate enough to perform real-time safe navigation.
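The core technique the abstract describes, stereo block matching, compares a small window around each left-image pixel against candidate windows in the right image at several disparity offsets and keeps the lowest-cost match. The paper evaluates 16 candidate disparities in parallel with SSE instructions; the sketch below is a plain NumPy stand-in for that idea, using a sum-of-absolute-differences (SAD) cost over one scanline. The function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def block_match_row(left, right, row, max_disp=16, block=5):
    """Naive SAD block matching for one scanline.

    For each pixel in the left image, test max_disp candidate disparities
    (left pixel x is compared against right pixel x - d) and keep the one
    with the lowest sum of absolute differences over a block x block window.
    The paper computes the 16 candidate costs in one pass with SSE; here a
    Python loop over d stands in for that vectorization.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros(w, dtype=np.int32)
    for x in range(half + max_disp, w - half):
        # Reference window in the left image.
        patch = left[row - half:row + half + 1,
                     x - half:x + half + 1].astype(np.int32)
        # SAD cost for each candidate disparity d.
        costs = [
            np.abs(patch - right[row - half:row + half + 1,
                                 x - d - half:x - d + half + 1].astype(np.int32)).sum()
            for d in range(max_disp)
        ]
        disp[x] = int(np.argmin(costs))  # winner-takes-all disparity
    return disp
```

A real implementation would aggregate costs incrementally (sliding the window) rather than recomputing each block, which, together with SIMD over the disparity axis, is what makes the per-frame times in the millisecond range achievable.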