Live demonstration: Neuromorphic event-based multi-kernel algorithm for high speed visual features tracking

Xavier Lagorce, Cedric Meyer, S. Ieng, David Filliat, R. Benosman
{"title":"Live demonstration: Neuromorphic event-based multi-kernel algorithm for high speed visual features tracking","authors":"Xavier Lagorce, Cedric Meyer, S. Ieng, David Filliat, R. Benosman","doi":"10.1109/BioCAS.2014.6981681","DOIUrl":null,"url":null,"abstract":"This demo presents a method of visual tracking using the output of an event-based asynchronous neuromorphic event-based camera. The approach is event-based thus adapted to the scene-driven properties of these sensors. The method allows to track multiple visual features in real time at a frequency of several hundreds kilohertz. It adapts to scene contents, combining both spatial and temporal correlations of events in an asynchronous iterative framework. Various kernels are used to track features from incoming events such as Gaussian, Gabor, combinations of Gabor functions and any hand-made kernel with very weak constraints. The proposed features tracking method can deal with feature variations in position, scale and orientation. The tracking performance is evaluated experimentally for each kernel to prove the robustness of the proposed solution.","PeriodicalId":414575,"journal":{"name":"2014 IEEE Biomedical Circuits and Systems Conference (BioCAS) Proceedings","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 IEEE Biomedical Circuits and Systems Conference (BioCAS) Proceedings","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BioCAS.2014.6981681","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

This demo presents a method for visual tracking using the output of an asynchronous neuromorphic event-based camera. The approach is event-based and is therefore adapted to the scene-driven properties of these sensors. The method can track multiple visual features in real time at rates of several hundred kilohertz. It adapts to scene content by combining the spatial and temporal correlations of events in an asynchronous iterative framework. Various kernels, such as Gaussian, Gabor, combinations of Gabor functions, and any hand-crafted kernel with very weak constraints, are used to track features from incoming events. The proposed feature tracking method can handle variations in feature position, scale, and orientation. The tracking performance is evaluated experimentally for each kernel to demonstrate the robustness of the proposed solution.
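The abstract does not give the update equations, but the per-event principle it describes, in which each incoming event is matched against a bank of kernels and the best-matching kernel's parameters are nudged toward that event, can be illustrated with a minimal sketch. The Python code below is a simplified, hypothetical illustration only: the `GaussianKernelTracker` class, its parameters `sigma`, `alpha`, and `threshold`, and the `track` helper are assumptions made for exposition, not the authors' implementation, and it tracks only position with an isotropic Gaussian, whereas the paper's method also adapts scale and orientation and supports Gabor and hand-crafted kernels.

```python
import numpy as np


class GaussianKernelTracker:
    """Minimal per-event tracker: one isotropic Gaussian kernel whose
    centre follows the incoming events that it explains well enough.
    (Illustrative sketch, not the method from the paper.)"""

    def __init__(self, x, y, sigma=5.0, alpha=0.05, threshold=0.1):
        self.center = np.array([x, y], dtype=float)  # current position estimate
        self.sigma = sigma          # spatial spread of the kernel (pixels)
        self.alpha = alpha          # update rate: how far one event moves the centre
        self.threshold = threshold  # minimum response for an event to be accepted

    def response(self, ex, ey):
        """Gaussian activation of an event at pixel (ex, ey)."""
        d2 = (ex - self.center[0]) ** 2 + (ey - self.center[1]) ** 2
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def update(self, ex, ey):
        """Process one event; shift the centre towards events the kernel matches."""
        r = self.response(ex, ey)
        if r < self.threshold:
            return False  # event attributed to another tracker or to background
        self.center += self.alpha * r * (np.array([ex, ey], dtype=float) - self.center)
        return True


def track(events, trackers):
    """Events arrive asynchronously as (x, y, timestamp, polarity) tuples;
    each event is assigned to the tracker with the strongest response."""
    for ex, ey, t, p in events:
        best = max(trackers, key=lambda k: k.response(ex, ey))
        best.update(ex, ey)
    return [k.center for k in trackers]
```

Because every event triggers only a constant-time kernel evaluation and a small incremental update, this event-driven structure is what lets such trackers run at update rates of hundreds of kilohertz rather than at a fixed frame rate.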