Fusing edges and feature points for robust target tracking

W. Li, Zelin Shi, Jian Yin, Qinghai Ding
{"title":"Fusing edges and feature points for robust target tracking","authors":"W. Li, Zelin Shi, Jian Yin, Qinghai Ding","doi":"10.1117/12.900721","DOIUrl":null,"url":null,"abstract":"Feature points and object edges are two kinds of primitives which are frequently used in target tracking algorithms. Feature points can be easily localized in an image. Their correspondences between images can be detected accurately. They can adapt to wide baseline transformations. However, feature points are not so stable that they are fragile to changes in illumination and viewpoint. On the contrary, object edges are stable under a very wide range of illumination and viewpoint changes. Unfortunately, edge-based algorithms often fail in the presence of highly textured targets and clutter which produce too many irrelevant edges. We found that both edge-based and point-based tracking have failure modes which are complementary. Based on this analysis, we propose a novel tracking algorithm which fuses point and edge features. Our tracking algorithm uses feature points matching to track object first, and then uses the transformation parameters archived in the first step to initialize the edge tracking. By this means, our algorithm alleviates the disturbance of irrelevant edges. Then, we use the texture boundary detection algorithm to find the precise object boundary. Texture boundary detection is different from the conventional gradient-based edge detection which can directly compute the most probable location of a texture boundary on the search line. Therefore, it is very fast and can be incorporated into a real-time tracking algorithm. Experimental results show that our tracking algorithm has outstanding tracking accuracy and robustness.","PeriodicalId":355017,"journal":{"name":"Photoelectronic Detection and Imaging","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Photoelectronic Detection and Imaging","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.900721","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Feature points and object edges are two kinds of primitives frequently used in target tracking algorithms. Feature points can be localized easily in an image, their correspondences between images can be detected accurately, and they can accommodate wide-baseline transformations. However, feature points are not stable: they are fragile to changes in illumination and viewpoint. In contrast, object edges remain stable under a very wide range of illumination and viewpoint changes. Unfortunately, edge-based algorithms often fail on highly textured targets and in clutter, which produce too many irrelevant edges. We found that edge-based and point-based tracking have complementary failure modes. Based on this analysis, we propose a novel tracking algorithm that fuses point and edge features. The algorithm first tracks the object by feature-point matching, and then uses the transformation parameters obtained in this first step to initialize edge tracking. In this way, the algorithm alleviates the disturbance caused by irrelevant edges. Finally, we use a texture boundary detection algorithm to locate the precise object boundary. Unlike conventional gradient-based edge detection, texture boundary detection directly computes the most probable location of a texture boundary along the search line; it is therefore very fast and can be incorporated into a real-time tracking algorithm. Experimental results show that the proposed tracking algorithm achieves outstanding tracking accuracy and robustness.
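
The abstract describes a two-stage pipeline: feature-point matching first estimates the inter-frame transformation, which then initializes a boundary search that looks for the most probable boundary location along each search line. The paper itself provides no code, so the following Python/OpenCV sketch only illustrates that structure under explicit assumptions: ORB features, brute-force matching, and a RANSAC homography stand in for the unspecified point-matching step, and a simple two-sided intensity contrast along the contour normal stands in for the paper's texture boundary statistic.

```python
import cv2
import numpy as np


def estimate_transform(prev_gray, curr_gray):
    """Stage 1 (assumed ORB + RANSAC): match feature points between frames
    and estimate a homography describing the target's motion."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    if len(matches) < 4:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H


def refine_boundary(curr_gray, predicted_contour, search_len=8, offset=3):
    """Stage 2: for each predicted contour point, search along its normal for the
    position with the strongest contrast between the two sides of the line.
    (A simple stand-in for the paper's texture boundary detector.)"""
    h, w = curr_gray.shape[:2]
    pts = predicted_contour.astype(np.float32)
    refined = []
    n = len(pts)
    for i, p in enumerate(pts):
        # Approximate the local normal from the neighbouring contour points.
        tangent = pts[(i + 1) % n] - pts[i - 1]
        normal = np.array([-tangent[1], tangent[0]], dtype=np.float32)
        normal /= (np.linalg.norm(normal) + 1e-6)
        best, best_score = p, -1.0
        for t in range(-search_len, search_len + 1):
            q = p + t * normal
            a = np.rint(q - offset * normal).astype(int)  # sample on one side
            b = np.rint(q + offset * normal).astype(int)  # sample on the other side
            if not (0 <= a[0] < w and 0 <= a[1] < h and 0 <= b[0] < w and 0 <= b[1] < h):
                continue
            score = abs(float(curr_gray[a[1], a[0]]) - float(curr_gray[b[1], b[0]]))
            if score > best_score:
                best_score, best = score, q
        refined.append(best)
    return np.array(refined)


def track_frame(prev_gray, curr_gray, contour):
    """Per-frame update: point matching predicts the contour,
    boundary search along normals refines it."""
    H = estimate_transform(prev_gray, curr_gray)
    if H is None:
        return contour  # fall back to the previous contour
    warped = cv2.perspectiveTransform(contour.reshape(-1, 1, 2).astype(np.float32), H)
    return refine_boundary(curr_gray, warped.reshape(-1, 2))
```

The key design point carried over from the abstract is the ordering: because the point-based stage supplies the transformation first, the boundary search only examines a short segment around the predicted contour, which is what keeps irrelevant edges from disturbing the edge stage and keeps the per-frame cost compatible with real-time tracking. The specific score used here is an illustrative assumption, not the paper's detector.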