Affective handshake with a humanoid robot: How do participants perceive and combine its facial and haptic expressions?

Mohamed Yacine Tsalamlal, Jean-Claude Martin, M. Ammi, A. Tapus, M. Amorim
{"title":"Affective handshake with a humanoid robot: How do participants perceive and combine its facial and haptic expressions?","authors":"Mohamed Yacine Tsalamlal, Jean-Claude Martin, M. Ammi, A. Tapus, M. Amorim","doi":"10.1109/ACII.2015.7344592","DOIUrl":null,"url":null,"abstract":"This study presents an experiment highlighting how participants combine facial expressions and haptic feedback to perceive emotions when interacting with an expressive humanoid robot. Participants were asked to interact with the humanoid robot through a handshake behavior while looking at its facial expressions. Experimental data were examined within the information integration theory framework. Results revealed that participants combined Facial and Haptic cues additively to evaluate the Valence, Arousal, and Dominance dimensions. The relative importance of each modality was different across the emotional dimensions. Participants gave more importance to facial expressions when evaluating Valence. They gave more importance to haptic feedback when evaluating Arousal and Dominance.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"7 1","pages":"334-340"},"PeriodicalIF":0.0000,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"17","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACII.2015.7344592","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 17

Abstract

This study presents an experiment highlighting how participants combine facial expressions and haptic feedback to perceive emotions when interacting with an expressive humanoid robot. Participants were asked to interact with the humanoid robot through a handshake behavior while looking at its facial expressions. Experimental data were examined within the information integration theory framework. Results revealed that participants combined Facial and Haptic cues additively to evaluate the Valence, Arousal, and Dominance dimensions. The relative importance of each modality was different across the emotional dimensions. Participants gave more importance to facial expressions when evaluating Valence. They gave more importance to haptic feedback when evaluating Arousal and Dominance.
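Within information integration theory, an additive integration rule means that each judged rating can be modeled as a weighted sum of the scale values contributed by the individual cues. As an illustrative sketch of this standard additive form (the notation below is ours and is not reproduced from the paper), a rating R_d on emotional dimension d (Valence, Arousal, or Dominance) would be written as:

R_d = w_0 + w_F * s_F + w_H * s_H

where s_F and s_H denote the subjective scale values of the facial and haptic cues and w_F, w_H their weights. Under this reading, the finding that the relative importance of each modality differs across dimensions corresponds to the relative weight w_F / (w_F + w_H) being larger when judging Valence and smaller when judging Arousal and Dominance.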