Deformable Spatial Pyramid Matching for Fast Dense Correspondences

Jaechul Kim, Ce Liu, Fei Sha, Kristen Grauman
{"title":"Deformable Spatial Pyramid Matching for Fast Dense Correspondences","authors":"Jaechul Kim, Ce Liu, Fei Sha, K. Grauman","doi":"10.1109/CVPR.2013.299","DOIUrl":null,"url":null,"abstract":"We introduce a fast deformable spatial pyramid (DSP) matching algorithm for computing dense pixel correspondences. Dense matching methods typically enforce both appearance agreement between matched pixels as well as geometric smoothness between neighboring pixels. Whereas the prevailing approaches operate at the pixel level, we propose a pyramid graph model that simultaneously regularizes match consistency at multiple spatial extents-ranging from an entire image, to coarse grid cells, to every single pixel. This novel regularization substantially improves pixel-level matching in the face of challenging image variations, while the \"deformable\" aspect of our model overcomes the strict rigidity of traditional spatial pyramids. Results on Label Me and Caltech show our approach outperforms state-of-the-art methods (SIFT Flow [15] and Patch-Match [2]), both in terms of accuracy and run time.","PeriodicalId":6343,"journal":{"name":"2013 IEEE Conference on Computer Vision and Pattern Recognition","volume":"23 1","pages":"2307-2314"},"PeriodicalIF":0.0000,"publicationDate":"2013-06-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"259","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 IEEE Conference on Computer Vision and Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CVPR.2013.299","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 259

Abstract

We introduce a fast deformable spatial pyramid (DSP) matching algorithm for computing dense pixel correspondences. Dense matching methods typically enforce both appearance agreement between matched pixels and geometric smoothness between neighboring pixels. Whereas the prevailing approaches operate at the pixel level, we propose a pyramid graph model that simultaneously regularizes match consistency at multiple spatial extents, ranging from an entire image, to coarse grid cells, to every single pixel. This novel regularization substantially improves pixel-level matching in the face of challenging image variations, while the "deformable" aspect of our model overcomes the strict rigidity of traditional spatial pyramids. Results on LabelMe and Caltech show our approach outperforms state-of-the-art methods (SIFT Flow [15] and PatchMatch [2]) in terms of both accuracy and run time.
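To make the pyramid graph model concrete, the following is a sketch of the kind of matching energy the abstract describes, written as a standard MRF objective. The notation here is assumed for illustration rather than quoted from the paper: node i ranges over pyramid cells (the whole image, coarse grid cells, and individual pixels), t_i is the 2D translation assigned to cell i, d_1 and d_2 are per-pixel descriptors in the two images, and α, λ, τ are illustrative weighting and truncation constants.

% Sketch of a multi-level matching energy (illustrative notation, not verbatim from the paper).
% The data term D_i averages a truncated descriptor-matching cost over the pixels in cell i;
% the smoothness term V_ij penalizes translation disagreement along parent-child edges N.
\begin{align}
E(\mathbf{t}) &= \sum_i D_i(t_i) \;+\; \alpha \sum_{(i,j)\in\mathcal{N}} V_{ij}(t_i, t_j) \\
D_i(t_i) &= \frac{1}{z_i} \sum_{p \in i} \min\!\bigl( \lVert d_1(p) - d_2(p + t_i) \rVert_1,\; \lambda \bigr) \\
V_{ij}(t_i, t_j) &= \min\!\bigl( \lVert t_i - t_j \rVert,\; \tau \bigr)
\end{align}

Because the edges couple each cell only to its parent and children across pyramid levels, the graph is far sparser than a fully connected per-pixel MRF, which is one plausible source of the run-time advantage the abstract reports.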