Asynchronous Event-Based Hebbian Epipolar Geometry

IEEE Transactions on Neural Networks · Published 2011-11-01 (Epub 2011-09-26) · DOI: 10.1109/TNN.2011.2167239
Ryad Benosman, Sio-Hoï Ieng, Paul Rogister, Christoph Posch
Citations: 48

Abstract

Epipolar geometry, the cornerstone of perspective stereo vision, has been studied extensively since the advent of computer vision. Establishing such a geometric constraint is of primary importance, as it allows the recovery of the 3-D structure of scenes. Estimating the epipolar constraints of nonperspective stereo is difficult: the constraints can no longer be defined in the usual way because of the complexity of the sensor geometry. This paper shows that these limitations are, to some extent, a consequence of the static image frames commonly used in vision. The conventional frame-based approach lacks the dynamics present in natural scenes. We introduce the use of neuromorphic event-based, rather than frame-based, vision sensors for perspective stereo vision. This type of sensor uses the dimension of time as the main conveyor of information. In this paper, we present a model for asynchronous event-based vision, which is then used to derive a new, general concept of epipolar geometry linked to the temporal activation of pixels. Practical experiments demonstrate the validity of the approach, solving the problem of estimating the fundamental matrix, applied first to classic perspective vision and then to more general cameras. Furthermore, this paper shows that the properties of event-based vision sensors allow the exploration of not-yet-defined geometric relationships; finally, we provide a definition of general epipolar geometry deployable to almost any visual sensor.
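The fundamental-matrix constraint the abstract refers to can be illustrated with a minimal conventional (frame-based) sketch: for a calibrated stereo pair, every true correspondence between the two views satisfies x2ᵀ F x1 = 0. The rig below (pure-translation baseline, identity intrinsics, so F reduces to the essential matrix [t]ₓR) is a hypothetical example for illustration only; it is not the paper's event-based, temporally driven estimator.

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t]_x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Hypothetical rig: camera 1 at the origin; camera 2 displaced by (R, t),
# mapping camera-1 coordinates to camera-2 coordinates as X2 = R @ X1 + t.
R = np.eye(3)                      # no rotation, for simplicity
t = np.array([1.0, 0.0, 0.0])      # baseline along x
F = skew(t) @ R                    # with identity intrinsics, F == E

# Random 3-D points in front of both cameras, projected by a unit-focal
# pinhole model into homogeneous image coordinates.
rng = np.random.default_rng(0)
X1 = rng.uniform([-1, -1, 4], [1, 1, 8], size=(50, 3))
X2 = (R @ X1.T).T + t
x1 = np.c_[X1[:, 0] / X1[:, 2], X1[:, 1] / X1[:, 2], np.ones(len(X1))]
x2 = np.c_[X2[:, 0] / X2[:, 2], X2[:, 1] / X2[:, 2], np.ones(len(X2))]

# Epipolar constraint: x2^T F x1 vanishes for every true correspondence.
residuals = np.einsum('ni,ij,nj->n', x2, F, x1)
print(np.max(np.abs(residuals)))   # ~0, up to floating-point error
```

The paper's contribution is to recover this same constraint without frames, by exploiting the precise timestamps of pixel events as the matching cue; the sketch above only shows the geometric relation being estimated.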
