Can you tell that I’m confused? An overhearer study for German backchannels by an embodied agent

Isabel Donya Meywirth, Jana Götze
{"title":"Can you tell that I’m confused? An overhearer study for German backchannels by an embodied agent","authors":"Isabel Donya Meywirth, Jana Götze","doi":"10.1145/3536220.3558804","DOIUrl":null,"url":null,"abstract":"In spoken interaction, humans constantly display and interpret each others’ state of understanding. For an embodied agent, displaying its internal state of understanding in an efficient manner can be an important means for making a user-interaction more natural and initiate error recovery as early as possible. We carry out an overhearer study with 62 participants to investigate whether German verbal and non-verbal backchannels by the virtual Furhat embodied agent can be interpreted by an overhearer of a human-robot conversation. We compare three positive, three negative, and one neutral feedback reaction. We find that even though it is difficult to generate certain verbal backchannels, our participants can recognize displays of understanding with an accuracy of up to 0.92. Trying to communicate a lack of understanding is more often misunderstood (accuracy: 0.55), meaning that interaction designers need to carefully craft them in order to be useful for the interaction flow.","PeriodicalId":186796,"journal":{"name":"Companion Publication of the 2022 International Conference on Multimodal Interaction","volume":"67 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Companion Publication of the 2022 International Conference on Multimodal Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3536220.3558804","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

In spoken interaction, humans constantly display and interpret each other's state of understanding. For an embodied agent, displaying its internal state of understanding in an efficient manner can be an important means of making a user interaction more natural and of initiating error recovery as early as possible. We carry out an overhearer study with 62 participants to investigate whether German verbal and non-verbal backchannels produced by the virtual Furhat embodied agent can be interpreted by an overhearer of a human-robot conversation. We compare three positive, three negative, and one neutral feedback reaction. We find that even though it is difficult to generate certain verbal backchannels, our participants can recognize displays of understanding with an accuracy of up to 0.92. Attempts to communicate a lack of understanding are misunderstood more often (accuracy: 0.55), meaning that interaction designers need to craft such signals carefully for them to be useful for the interaction flow.
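The reported numbers are per-category recognition accuracies over the overhearers' judgments. The study's own analysis pipeline is not included on this page, so the following is only a minimal sketch of how such accuracies could be tallied, assuming a hypothetical responses.csv with one row per judgment and hypothetical columns condition (the intended feedback category: positive, negative, or neutral) and judgement (the category the overhearer assigned).

```python
# Sketch only (not the authors' code): per-category recognition accuracy
# from hypothetical overhearer judgments stored in "responses.csv" with
# columns "condition" (intended category) and "judgement" (assigned category).
import csv
from collections import defaultdict

correct = defaultdict(int)  # judgments matching the intended category
total = defaultdict(int)    # all judgments per intended category

with open("responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        condition = row["condition"].strip().lower()
        judgement = row["judgement"].strip().lower()
        total[condition] += 1
        if judgement == condition:
            correct[condition] += 1

for condition in sorted(total):
    accuracy = correct[condition] / total[condition]
    print(f"{condition:>8}: accuracy = {accuracy:.2f} (n = {total[condition]})")
```

Under these assumptions, a judgment counts as correct when the assigned category matches the intended one, and figures such as the reported 0.92 and 0.55 are simply the per-category fractions of matching judgments.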