Reconstructing Occluded Surfaces Using Synthetic Apertures: Stereo, Focus and Robust Measures

V. Vaish, M. Levoy, R. Szeliski, C. L. Zitnick, S. B. Kang
{"title":"使用合成孔径重建遮挡表面:立体、聚焦和鲁棒测量","authors":"V. Vaish, M. Levoy, R. Szeliski, C. L. Zitnick, S. B. Kang","doi":"10.1109/CVPR.2006.244","DOIUrl":null,"url":null,"abstract":"Most algorithms for 3D reconstruction from images use cost functions based on SSD, which assume that the surfaces being reconstructed are visible to all cameras. This makes it difficult to reconstruct objects which are partially occluded. Recently, researchers working with large camera arrays have shown it is possible to \"see through\" occlusions using a technique called synthetic aperture focusing. This suggests that we can design alternative cost functions that are robust to occlusions using synthetic apertures. Our paper explores this design space. We compare classical shape from stereo with shape from synthetic aperture focus. We also describe two variants of multi-view stereo based on color medians and entropy that increase robustness to occlusions. We present an experimental comparison of these cost functions on complex light fields, measuring their accuracy against the amount of occlusion.","PeriodicalId":421737,"journal":{"name":"2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-06-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"226","resultStr":"{\"title\":\"Reconstructing Occluded Surfaces Using Synthetic Apertures: Stereo, Focus and Robust Measures\",\"authors\":\"V. Vaish, M. Levoy, R. Szeliski, C. L. Zitnick, S. B. Kang\",\"doi\":\"10.1109/CVPR.2006.244\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Most algorithms for 3D reconstruction from images use cost functions based on SSD, which assume that the surfaces being reconstructed are visible to all cameras. This makes it difficult to reconstruct objects which are partially occluded. Recently, researchers working with large camera arrays have shown it is possible to \\\"see through\\\" occlusions using a technique called synthetic aperture focusing. This suggests that we can design alternative cost functions that are robust to occlusions using synthetic apertures. Our paper explores this design space. We compare classical shape from stereo with shape from synthetic aperture focus. We also describe two variants of multi-view stereo based on color medians and entropy that increase robustness to occlusions. 
We present an experimental comparison of these cost functions on complex light fields, measuring their accuracy against the amount of occlusion.\",\"PeriodicalId\":421737,\"journal\":{\"name\":\"2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)\",\"volume\":\"16 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2006-06-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"226\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CVPR.2006.244\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CVPR.2006.244","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 226

Abstract

Most algorithms for 3D reconstruction from images use cost functions based on SSD, which assume that the surfaces being reconstructed are visible to all cameras. This makes it difficult to reconstruct objects which are partially occluded. Recently, researchers working with large camera arrays have shown it is possible to "see through" occlusions using a technique called synthetic aperture focusing. This suggests that we can design alternative cost functions that are robust to occlusions using synthetic apertures. Our paper explores this design space. We compare classical shape from stereo with shape from synthetic aperture focus. We also describe two variants of multi-view stereo based on color medians and entropy that increase robustness to occlusions. We present an experimental comparison of these cost functions on complex light fields, measuring their accuracy against the amount of occlusion.
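The abstract contrasts SSD-based photo-consistency with synthetic aperture focusing and with median- and entropy-based costs. As a rough illustration only (not the authors' implementation), the sketch below shows shift-and-average refocusing over a planar camera array and the three cost variants evaluated on the per-view color samples of one pixel. The helper names, the (u, v) array parameterization, and the histogram binning are assumptions made for this sketch.

```python
import numpy as np

# Illustrative sketch only: not the paper's implementation. The planar
# (u, v) camera parameterization, function names, and binning choices
# are assumptions for demonstration.

def synthetic_aperture_image(images, cam_positions, disparity):
    """Shift-and-average refocusing over a planar camera array.

    images        : list of HxWx3 float arrays, one per camera
    cam_positions : Nx2 array of camera (u, v) offsets on the array plane
    disparity     : pixel shift per unit of camera offset; choosing it
                    selects the focal plane in the scene
    Points on the focal plane align across the shifted views; occluders
    off that plane are spread out and blurred away.
    """
    acc = np.zeros_like(images[0])
    for img, (u, v) in zip(images, cam_positions):
        dy, dx = int(round(disparity * v)), int(round(disparity * u))
        acc += np.roll(img, (dy, dx), axis=(0, 1))
    return acc / len(images)

def ssd_cost(samples):
    """Classical photo-consistency: sum of squared differences from the
    mean color across views. Breaks down when some cameras see an
    occluder instead of the surface."""
    return np.sum((samples - samples.mean(axis=0)) ** 2)

def median_cost(samples):
    """Robust variant: total deviation from the per-channel median, so a
    minority of occluded views has little influence on the cost."""
    med = np.median(samples, axis=0)
    return np.sum(np.abs(samples - med))

def entropy_cost(samples, bins=16):
    """Robust variant: entropy of the color distribution across views; a
    correctly focused surface point gives a peaked, low-entropy
    histogram even if some views are occluded."""
    hist, _ = np.histogramdd(samples, bins=bins,
                             range=[(0.0, 1.0)] * samples.shape[1])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Example usage with synthetic data (values in [0, 1]):
imgs = [np.random.rand(8, 8, 3) for _ in range(4)]
pos = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
refocused = synthetic_aperture_image(imgs, pos, disparity=1.0)

samples = np.random.rand(25, 3)   # colors seen by a 5x5 array at one pixel
print(ssd_cost(samples), median_cost(samples), entropy_cost(samples))
```

In this sketch, lower cost values indicate better photo-consistency at the candidate depth; the median and entropy variants are the kind of occlusion-robust alternatives to SSD that the paper evaluates.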