Perception of gaze direction for situated interaction

Samer Al Moubayed, Gabriel Skantze
DOI: 10.1145/2401836.2401839
Published in: Gaze-In '12, 2012-10-26
Citations: 23

Abstract

Accurate human perception of a robot's gaze direction is crucial for designing natural and fluent situated multimodal face-to-face interaction between humans and machines. In this paper, we present an experiment, with 18 test subjects, that quantifies how different gaze cues synthesized on the Furhat back-projected robot head affect the accuracy with which humans perceive the spatial direction of gaze. The study first quantifies the accuracy of perceived gaze direction in a human-human setup, and compares it to synthesized gaze movements under different conditions: viewing the robot's eyes frontally or from a 45-degree side view. We also study the effect of 3D gaze, in which both eyes are controlled to indicate the depth of the focal point (vergence), the use of gaze versus head pose, and the use of static versus dynamic eyelids. The findings are highly relevant to the design and control of robots and animated agents in situated face-to-face interaction.
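The vergence cue mentioned in the abstract — orienting both eyes so their gaze lines converge at the focal point — follows from simple geometry. The sketch below illustrates that geometry; the function names, coordinate convention, and interocular distance are illustrative assumptions, not values from the paper.

```python
import math

def eye_gaze_angles(target_x, target_z, ipd=0.06):
    """Yaw angle (radians) for each eye fixating a point in front of the head.

    target_x: lateral offset of the focal point (m, positive to the head's left)
    target_z: depth of the focal point (m, straight ahead)
    ipd: interocular distance (m); 0.06 is an illustrative human-like value.
    """
    half = ipd / 2.0
    # Each eye rotates toward the target from its own position on the x-axis.
    left_yaw = math.atan2(target_x + half, target_z)
    right_yaw = math.atan2(target_x - half, target_z)
    return left_yaw, right_yaw

def vergence(left_yaw, right_yaw):
    """Vergence angle: how much the two gaze lines converge (radians)."""
    return left_yaw - right_yaw
```

Nearer focal points produce larger vergence angles, which is what lets an observer read depth from the eyes: at 0.5 m the gaze lines converge noticeably, while at 2 m the eyes are nearly parallel.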