An Occlusion-Resolving Hand Tracking Method

Yea-Shuan Huang, Y. Chen
{"title":"An Occlusion-Resolving Hand Tracking Method","authors":"Yea-Shuan Huang, Y. Chen","doi":"10.1109/U-MEDIA.2014.17","DOIUrl":null,"url":null,"abstract":"This paper proposes a novel algorithm for real-time hand detection and tracking, and this algorithm can successfully track a hand even when it is overlapped with other skin-color objects during tracking. An on-line adaptive learning approach associated with negative skin-color exclusion is used to fit the skin color distribution of each individual hand in various environments. When skin-color objects have been extracted, three states (separation, proximity and overlap) between tracked objects are defined. A separation template image of the tracking hand is created whenever it is in the proximity state, and a feature-point-based matching comparison by using the newly created separation template is conducted when it is in the overlap state. The experimental results show the proposed algorithm not only can obtain a highly accurate hand tracking rate in various situations, but also can run in real time with 30-45 frames per second.","PeriodicalId":174849,"journal":{"name":"2014 7th International Conference on Ubi-Media Computing and Workshops","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 7th International Conference on Ubi-Media Computing and Workshops","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/U-MEDIA.2014.17","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

This paper proposes a novel algorithm for real-time hand detection and tracking that can continue to track a hand even when it overlaps with other skin-color objects. An on-line adaptive learning approach, combined with negative skin-color exclusion, is used to fit the skin-color distribution of each individual hand in varied environments. Once skin-color objects have been extracted, three states between tracked objects are defined: separation, proximity, and overlap. A separation template image of the tracked hand is created whenever it enters the proximity state, and a feature-point-based matching comparison against this newly created separation template is performed when it enters the overlap state. Experimental results show that the proposed algorithm not only achieves highly accurate hand tracking in a variety of situations, but also runs in real time at 30-45 frames per second.
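The abstract describes a three-state pipeline: a separation template of the hand is captured while objects are merely in proximity, and feature-point matching against that template resolves the hand once the objects overlap. The sketch below is a minimal, hypothetical illustration of that control flow in Python with OpenCV; the bounding-box state rules, the proximity margin, and the ORB-based matching are assumptions on my part, since the page gives no implementation details.

```python
# Minimal sketch (NOT the authors' implementation) of the three-state
# tracking logic. State criteria, margins, and the choice of ORB features
# are illustrative assumptions.
import cv2

SEPARATION, PROXIMITY, OVERLAP = range(3)

def classify_state(hand_box, other_boxes, proximity_margin=20):
    """Assign a state from bounding-box geometry between the tracked hand
    and other skin-color objects (assumed rule, not from the paper)."""
    hx, hy, hw, hh = hand_box
    near = False
    for ox, oy, ow, oh in other_boxes:
        # Overlap: the two bounding boxes intersect.
        if hx < ox + ow and ox < hx + hw and hy < oy + oh and oy < hy + hh:
            return OVERLAP
        # Proximity: the other box falls within a small margin of the hand box.
        ex, ey = hx - proximity_margin, hy - proximity_margin
        ew, eh = hw + 2 * proximity_margin, hh + 2 * proximity_margin
        if ex < ox + ow and ox < ex + ew and ey < oy + oh and oy < ey + eh:
            near = True
    return PROXIMITY if near else SEPARATION

class HandTracker:
    def __init__(self):
        self.template = None          # separation template image of the hand
        self.orb = cv2.ORB_create()   # stand-in feature detector for matching

    def update(self, frame, hand_box, other_boxes):
        state = classify_state(hand_box, other_boxes)
        x, y, w, h = hand_box
        if state == PROXIMITY:
            # Snapshot the hand while it is still unmerged, to use as a
            # template during the coming overlap.
            self.template = frame[y:y + h, x:x + w].copy()
        elif state == OVERLAP and self.template is not None:
            # Match feature points between the stored template and the
            # merged skin-color region to keep locating the hand.
            kp1, des1 = self.orb.detectAndCompute(self.template, None)
            kp2, des2 = self.orb.detectAndCompute(frame[y:y + h, x:x + w], None)
            if des1 is not None and des2 is not None:
                matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
                matches = matcher.match(des1, des2)
                # ... estimate the hand position from the matches (omitted)
        return state
```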