SoundGuides: Adapting Continuous Auditory Feedback to Users

Jules Françoise, O. Chapuis, S. Hanneton, Frédéric Bevilacqua
{"title":"SoundGuides: Adapting Continuous Auditory Feedback to Users","authors":"Jules Françoise, O. Chapuis, S. Hanneton, Frédéric Bevilacqua","doi":"10.1145/2851581.2892420","DOIUrl":null,"url":null,"abstract":"We introduce SoundGuides, a user adaptable tool for auditory feedback on movement. The system is based on a interactive machine learning approach, where both gestures and sounds are first conjointly designed and conjointly learned by the system. The system can then automatically adapt the auditory feedback to any new user, taking into account the particular way each user performs a given gesture. SoundGuides is suitable for the design of continuous auditory feedback aimed at guiding users' movements and helping them to perform a specific movement consistently over time. Applications span from movement-based interaction techniques to auditory-guided rehabilitation. We first describe our system and report a study that demonstrates a 'stabilizing effect' of our adaptive auditory feedback method.","PeriodicalId":285547,"journal":{"name":"Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems","volume":"21 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-05-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"12","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2851581.2892420","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 12

Abstract

We introduce SoundGuides, a user-adaptable tool for auditory feedback on movement. The system is based on an interactive machine learning approach in which gestures and sounds are first jointly designed and then jointly learned by the system. The system can then automatically adapt the auditory feedback to any new user, taking into account the particular way each user performs a given gesture. SoundGuides is suitable for the design of continuous auditory feedback aimed at guiding users' movements and helping them perform a specific movement consistently over time. Applications range from movement-based interaction techniques to auditory-guided rehabilitation. We describe our system and report a study that demonstrates a 'stabilizing effect' of our adaptive auditory feedback method.
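
The abstract does not detail the underlying model, but the workflow it describes (jointly recorded gesture-sound examples, a learned mapping, then per-user adaptation) can be illustrated with a minimal sketch. The Python example below is hypothetical and is not the authors' implementation: it substitutes a generic regression model (scikit-learn's KNeighborsRegressor) for whatever model SoundGuides actually uses, runs on synthetic placeholder data, and uses a simple feature-normalization step as a stand-in for adapting to a new user.

```python
# Hypothetical sketch of a gesture-to-sound mapping with per-user adaptation.
# Not the SoundGuides implementation; model choice and data are placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# --- Design phase: a designer performs a gesture while sound parameters are recorded.
# Each row of `designer_gesture` is one frame of gesture features (e.g., accelerometer x/y/z);
# each row of `sound_params` is the matching synthesis parameters (e.g., pitch, gain).
rng = np.random.default_rng(0)
designer_gesture = rng.normal(loc=0.0, scale=1.0, size=(200, 3))  # placeholder data
sound_params = rng.uniform(0.0, 1.0, size=(200, 2))               # placeholder data

# Learn the joint gesture-sound relationship as a regression model.
mapping = KNeighborsRegressor(n_neighbors=5).fit(designer_gesture, sound_params)

# --- Adaptation phase: a new user performs the same gesture a few times.
# Estimate a per-user normalization so the user's features land in the
# designer's feature space (one simple way to account for how this user moves).
user_calibration = rng.normal(loc=0.5, scale=2.0, size=(50, 3))   # placeholder data
user_mean, user_std = user_calibration.mean(axis=0), user_calibration.std(axis=0)
design_mean, design_std = designer_gesture.mean(axis=0), designer_gesture.std(axis=0)

def adapt(frame: np.ndarray) -> np.ndarray:
    """Map one frame of the user's gesture features into the designer's feature space."""
    return (frame - user_mean) / (user_std + 1e-9) * design_std + design_mean

# --- Interaction phase: stream the user's movement through the adapted mapping
# to drive continuous auditory feedback.
live_frame = rng.normal(loc=0.5, scale=2.0, size=(1, 3))          # placeholder frame
predicted_sound = mapping.predict(adapt(live_frame))
print(predicted_sound)  # e.g., [pitch, gain] values sent to a synthesizer
```

In a real system the predicted parameters would be sent to a sound engine frame by frame, so that the feedback varies continuously with the user's movement; the adaptation step is what lets the same designed mapping respond to each user's particular way of performing the gesture.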