RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowledge on User-Defined Gestures

Bogdan-Florin Gheran, Santiago Villarreal-Narvaez, Radu-Daniel Vatavu, J. Vanderdonckt
{"title":"RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowledge on User-Defined Gestures","authors":"Bogdan-Florin Gheran, Santiago Villarreal-Narvaez, Radu-Daniel Vatavu, J. Vanderdonckt","doi":"10.1145/3531073.3531112","DOIUrl":null,"url":null,"abstract":"The body of knowledge accumulated by gesture elicitation studies (GES), although useful, large, and extensive, is also heterogeneous, scattered in the scientific literature across different venues and fields of research, and difficult to generalize to other contexts of use represented by different gesture types, sensing devices, applications, and user categories. To address such aspects, we introduce RepliGES, a conceptual space that supports (1) replications of gesture elicitation studies to confirm, extend, and complete previous findings, (2) reuse of previously elicited gesture sets to enable new discoveries, and (3) extension and generalization of previous findings with new methods of analysis and for new user populations towards consolidated knowledge of user-defined gestures. 
Based on RepliGES, we introduce GEStory, an interactive design space and visual tool, to structure, visualize and identify user-defined gestures from a number of 216 published gesture elicitation studies.","PeriodicalId":412533,"journal":{"name":"Proceedings of the 2022 International Conference on Advanced Visual Interfaces","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 International Conference on Advanced Visual Interfaces","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3531073.3531112","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 7

Abstract

The body of knowledge accumulated by gesture elicitation studies (GES), although useful, large, and extensive, is also heterogeneous, scattered in the scientific literature across different venues and fields of research, and difficult to generalize to other contexts of use represented by different gesture types, sensing devices, applications, and user categories. To address such aspects, we introduce RepliGES, a conceptual space that supports (1) replications of gesture elicitation studies to confirm, extend, and complete previous findings, (2) reuse of previously elicited gesture sets to enable new discoveries, and (3) extension and generalization of previous findings with new methods of analysis and for new user populations towards consolidated knowledge of user-defined gestures. Based on RepliGES, we introduce GEStory, an interactive design space and visual tool, to structure, visualize, and identify user-defined gestures from 216 published gesture elicitation studies.