Styling Words: A Simple and Natural Way to Increase Variability in Training Data Collection for Gesture Recognition

Woojin Kang, Intaek Jung, Daeho Lee, Jin-Hyuk Hong
{"title":"Styling Words: A Simple and Natural Way to Increase Variability in Training Data Collection for Gesture Recognition","authors":"Woojin Kang, Intaek Jung, Daeho Lee, Jin-Hyuk Hong","doi":"10.1145/3411764.3445457","DOIUrl":null,"url":null,"abstract":"Due to advances in deep learning, gestures have become a more common tool for human-computer interaction. When implementing a large amount of training data, deep learning models show remarkable performance in gesture recognition. Since it is expensive and time consuming to collect gesture data from people, we are often confronted with a practicality issue when managing the quantity and quality of training data. It is a well-known fact that increasing training data variability can help to improve the generalization performance of machine learning models. Thus, we directly intervene in the collection of gesture data to increase human gesture variability by adding some words (called styling words) into the data collection instructions, e.g., giving the instruction \"perform gesture #1 faster\" as opposed to \"perform gesture #1.\" Through an in-depth analysis of gesture features and video-based gesture recognition, we have confirmed the advantageous use of styling words in gesture training data collection.","PeriodicalId":20451,"journal":{"name":"Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems","volume":"2014 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2021 CHI Conference on Human Factors in Computing 
Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3411764.3445457","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Due to advances in deep learning, gestures have become a more common tool for human-computer interaction. When trained on a large amount of data, deep learning models show remarkable performance in gesture recognition. Since collecting gesture data from people is expensive and time-consuming, we are often confronted with a practicality issue when managing the quantity and quality of training data. It is well known that increasing training data variability can help to improve the generalization performance of machine learning models. Thus, we directly intervene in the collection of gesture data to increase human gesture variability by adding some words (called styling words) to the data collection instructions, e.g., giving the instruction "perform gesture #1 faster" as opposed to "perform gesture #1." Through an in-depth analysis of gesture features and video-based gesture recognition, we have confirmed the advantageous use of styling words in gesture training data collection.
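The abstract describes a simple protocol: augment each base instruction ("perform gesture #1") with a styling word ("faster") to elicit more varied demonstrations. A minimal sketch of how such an instruction schedule could be generated is shown below; the styling words other than "faster", and the function names, are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of a styling-word instruction generator.
# Only "faster" appears in the abstract; the other styling words
# below are assumed examples for illustration.
import itertools
import random

GESTURES = [f"gesture #{i}" for i in range(1, 4)]
STYLING_WORDS = ["", "faster", "slower", "with a larger motion"]  # "" = plain instruction


def make_instruction(gesture: str, styling_word: str) -> str:
    """Build one data-collection prompt, e.g. 'perform gesture #1 faster'."""
    return f"perform {gesture} {styling_word}".strip()


def collection_schedule(seed: int = 0) -> list[str]:
    """Shuffle every gesture x styling-word combination into a session plan."""
    rng = random.Random(seed)
    plan = [make_instruction(g, s)
            for g, s in itertools.product(GESTURES, STYLING_WORDS)]
    rng.shuffle(plan)  # avoid presenting all variants of one gesture in a row
    return plan


print(make_instruction("gesture #1", "faster"))  # perform gesture #1 faster
```

Presenting participants with the shuffled schedule, rather than repeating the plain instruction, is the intervention the paper argues increases variability in the collected data.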