The Sound of Actuators: Disturbance in Human-Robot Interactions?

Mélanie Jouaiti, P. Hénaff
DOI: 10.1109/DEVLRN.2019.8850697
Published in: 2019 Joint IEEE 9th International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob)
Publication date: 2019-08-01
Citations: 2

Abstract

Human-robot interactions promise to increase as robots become more pervasive. One important aspect is gestural communication, which is quite popular in rehabilitation and therapeutic robotics. Indeed, synchrony is a key component of interpersonal interactions, affecting the interaction on the behavioural level as well as on the social level. When interacting physically with a robot, one perceives the robot's movements, but robot actuators also produce sound. In this work, we ask whether the sound of actuators can hamper human coordination in human-robot rhythmic interactions. Indeed, the human brain prioritizes auditory input over visual input. This effect can sometimes be so powerful as to alter or even override the visual perception. However, under certain circumstances, the auditory signal and the visual perception can reinforce each other. In this paper, we propose a study in which participants were asked to perform a waving-like gesture back at a robot under three conditions: visual perception only, auditory perception only, and both perceptions combined. We analyze coordination performance and focus of gaze in each condition. Results show that the combination of visual and auditory perceptions perturbs the rhythmic interaction.
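The abstract mentions analyzing "coordination performance" in rhythmic human-robot interaction. The paper does not specify its metric here, but a common way to quantify rhythmic coordination between two movement signals is the phase-locking value (PLV), computed from instantaneous phases via the Hilbert transform. The sketch below is a minimal illustration of that general technique, not the authors' actual analysis pipeline; the signal names and the 1 Hz waving frequency are assumptions for the example.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two 1-D signals: 1 = perfectly stable relative phase,
    values near 0 = no consistent phase relationship."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Hypothetical example: a 1 Hz "robot" oscillation and a "human"
# response that follows it with a constant phase lag.
t = np.linspace(0, 10, 1000)
robot = np.sin(2 * np.pi * 1.0 * t)
human = np.sin(2 * np.pi * 1.0 * t + 0.3)  # constant 0.3 rad lag

plv = phase_locking_value(robot, human)
```

A constant phase lag still yields a PLV close to 1, since PLV measures the stability of the relative phase, not its magnitude; a participant who waves consistently out of phase with the robot would still score as well coordinated by this metric.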