Three-dimensional registration and shape reconstruction from depth data without matching: A PDE approach

IF 0.5 · CAS Tier 4 (Mathematics) · JCR Q3 (MATHEMATICS) · Portugaliae Mathematica · Pub Date: 2019-06-06 · DOI: 10.4171/pm/2020
D. A. Gomes, J. Costeira, João Saúde
{"title":"Three-dimensional registration and shape reconstruction from depth data without matching: A PDE approach","authors":"D. A. Gomes, J. Costeira, João Saúde","doi":"10.4171/pm/2020","DOIUrl":null,"url":null,"abstract":"The widespread availability of depth sensors like the Kinect camera makes it easy to gather three-dimensional (3D) data. However, accurately and efficiently merging large datasets collected from different views is still a core problem in computer vision. This question is particularly challenging if the relative positions of the views are not known, if there are few or no overlapping points, or if there are multiple objects. Here, we develop a method to reconstruct the 3D shapes of objects from depth data taken from different views whose relative positions are not known. Our method does not assume that common points in the views exist nor that the number of objects is known a priori. To reconstruct the shapes, we use partial differential equations (PDE) to compute upper and lower bounds for distance functions, which are solutions of the Eikonal equation constrained by the depth data. To combine various views, we minimize a function that measures the compatibility of relative positions. As we illustrate in several examples, we can reconstruct complex objects, even in the case where multiple views do not overlap, and, therefore, do not have points in common. We present several simulations to illustrate our method including multiple objects, non-convex objects, and complex shapes. Moreover, we present an application of our PDE approach to object classification from depth data. D. Gomes was partially supported by baseline and start-up funds, from King Abdullah University of Science and Technology (KAUST). J. 
Saúde was partially supported by by the Portuguese Foundation for Science and Technology through the Carnegie Mellon Portugal Program under the Grant SFRH/BD/52162/2013.","PeriodicalId":51269,"journal":{"name":"Portugaliae Mathematica","volume":" ","pages":""},"PeriodicalIF":0.5000,"publicationDate":"2019-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.4171/pm/2020","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Portugaliae Mathematica","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.4171/pm/2020","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0

Abstract

The widespread availability of depth sensors like the Kinect camera makes it easy to gather three-dimensional (3D) data. However, accurately and efficiently merging large datasets collected from different views is still a core problem in computer vision. This question is particularly challenging if the relative positions of the views are not known, if there are few or no overlapping points, or if there are multiple objects. Here, we develop a method to reconstruct the 3D shapes of objects from depth data taken from different views whose relative positions are not known. Our method does not assume that common points in the views exist nor that the number of objects is known a priori. To reconstruct the shapes, we use partial differential equations (PDEs) to compute upper and lower bounds for distance functions, which are solutions of the Eikonal equation constrained by the depth data. To combine various views, we minimize a function that measures the compatibility of relative positions. As we illustrate in several examples, we can reconstruct complex objects, even in the case where multiple views do not overlap and, therefore, do not have points in common. We present several simulations to illustrate our method, including multiple objects, non-convex objects, and complex shapes. Moreover, we present an application of our PDE approach to object classification from depth data. D. Gomes was partially supported by baseline and start-up funds from King Abdullah University of Science and Technology (KAUST). J. Saúde was partially supported by the Portuguese Foundation for Science and Technology through the Carnegie Mellon Portugal Program under the Grant SFRH/BD/52162/2013.
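The abstract describes computing distance functions as solutions of the Eikonal equation constrained by the depth data. As a minimal sketch only (not the authors' upper/lower-bound scheme, whose details are in the paper), a standard Gauss–Seidel fast-sweeping solver for the unit-speed Eikonal equation |∇u| = 1 on a 2D grid illustrates the kind of PDE involved; the function name `fast_sweep_eikonal` and the use of depth samples as zero-distance seeds are assumptions for this example.

```python
import numpy as np

def fast_sweep_eikonal(seed_mask, h=1.0, n_sweeps=4):
    """Solve |grad u| = 1 on a 2D grid by Gauss-Seidel fast sweeping.

    seed_mask: boolean array, True where u = 0 (here, standing in for
    the points constrained by the depth data).
    h: grid spacing; n_sweeps: passes over the four sweep orderings.
    """
    ny, nx = seed_mask.shape
    u = np.full((ny, nx), np.inf)
    u[seed_mask] = 0.0
    # Four sweep orderings cover all characteristic directions.
    orders = [(range(ny), range(nx)),
              (range(ny), range(nx - 1, -1, -1)),
              (range(ny - 1, -1, -1), range(nx)),
              (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
    for _ in range(n_sweeps):
        for ys, xs in orders:
            for i in ys:
                for j in xs:
                    if seed_mask[i, j]:
                        continue
                    # Smallest upwind neighbors in each axis direction.
                    a = min(u[i - 1, j] if i > 0 else np.inf,
                            u[i + 1, j] if i < ny - 1 else np.inf)
                    b = min(u[i, j - 1] if j > 0 else np.inf,
                            u[i, j + 1] if j < nx - 1 else np.inf)
                    if abs(a - b) >= h:
                        # Characteristic arrives from one axis only.
                        cand = min(a, b) + h
                    else:
                        # Two-directional Godunov update.
                        cand = 0.5 * (a + b + np.sqrt(2 * h * h - (a - b) ** 2))
                    u[i, j] = min(u[i, j], cand)
    return u
```

On a grid seeded at a single point, `u` approximates the Euclidean distance to that point; exact along the axes, with the usual first-order discretization error on diagonals.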
Journal: Portugaliae Mathematica (MATHEMATICS, APPLIED MATHEMATICS)
CiteScore: 0.90
Self-citation rate: 12.50%
Articles per year: 23
Review time: >12 weeks
About the journal: Since its foundation in 1937, Portugaliae Mathematica has aimed at publishing high-level research articles in all branches of mathematics. With great efforts by its founders, the journal was able to publish articles by some of the best mathematicians of the time. In 2001 a New Series of Portugaliae Mathematica was started, reaffirming the purpose of maintaining a high-level research journal in mathematics with a wide-ranging scope.