Toward a better understanding of the communication cues involved in a human-robot object transfer

Mamoun Gharbi, Pierre-Vincent Paubel, A. Clodic, O. Carreras, R. Alami, J. Cellier
{"title":"为了更好地理解人-机器人物体转移中涉及的交流线索","authors":"Mamoun Gharbi, Pierre-Vincent Paubel, A. Clodic, O. Carreras, R. Alami, J. Cellier","doi":"10.1109/ROMAN.2015.7333626","DOIUrl":null,"url":null,"abstract":"Handing-over objects to humans (or taking objects from them) is a key capability for a service robot. Humans are efficient and natural while performing this action and the purpose of the studies on this topic is to bring human-robot handovers to an acceptable, efficient and natural level. This paper deals with the cues that allow to make a handover look as natural as possible, and more precisely we focus on where the robot should look while performing it. In this context we propose a user study, involving 33 volunteers, who judged video sequences where they see either a human or a robot giving them an object. They were presented with different sequences where the agents (robot or human) have different gaze behaviours, and were asked to give their feeling about the sequence naturalness. In addition to this subjective measure, the volunteers were equipped with an eye tracker which enabled us to have more accurate objective measures.","PeriodicalId":119467,"journal":{"name":"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-11-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"25","resultStr":"{\"title\":\"Toward a better understanding of the communication cues involved in a human-robot object transfer\",\"authors\":\"Mamoun Gharbi, Pierre-Vincent Paubel, A. Clodic, O. Carreras, R. Alami, J. Cellier\",\"doi\":\"10.1109/ROMAN.2015.7333626\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Handing-over objects to humans (or taking objects from them) is a key capability for a service robot. Humans are efficient and natural while performing this action and the purpose of the studies on this topic is to bring human-robot handovers to an acceptable, efficient and natural level. This paper deals with the cues that allow to make a handover look as natural as possible, and more precisely we focus on where the robot should look while performing it. In this context we propose a user study, involving 33 volunteers, who judged video sequences where they see either a human or a robot giving them an object. They were presented with different sequences where the agents (robot or human) have different gaze behaviours, and were asked to give their feeling about the sequence naturalness. 
In addition to this subjective measure, the volunteers were equipped with an eye tracker which enabled us to have more accurate objective measures.\",\"PeriodicalId\":119467,\"journal\":{\"name\":\"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)\",\"volume\":\"32 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-11-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"25\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ROMAN.2015.7333626\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ROMAN.2015.7333626","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 25

Abstract

Handing over objects to humans (or taking objects from them) is a key capability for a service robot. Humans perform this action efficiently and naturally, and the aim of research on this topic is to bring human-robot handovers to an acceptable, efficient and natural level. This paper deals with the cues that make a handover look as natural as possible; more precisely, we focus on where the robot should look while performing it. In this context we propose a user study involving 33 volunteers, who judged video sequences in which they see either a human or a robot handing them an object. They were presented with different sequences in which the agent (robot or human) exhibits different gaze behaviours, and were asked to rate how natural each sequence felt. In addition to this subjective measure, the volunteers were equipped with an eye tracker, which provided more accurate objective measures.
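The abstract does not describe the analysis itself, so the snippet below is only a minimal sketch of how subjective scores from such a study could be tabulated: group the naturalness ratings by (agent, gaze behaviour) condition and average them. The condition labels and rating values are hypothetical placeholders, not data or code from the paper.

from collections import defaultdict
from statistics import mean

# Hypothetical ratings: (agent, gaze behaviour, subjective naturalness score).
ratings = [
    ("robot", "looks at handover location only", 3.8),
    ("robot", "looks at receiver's face, then at handover location", 4.4),
    ("human", "looks at handover location only", 4.1),
    ("human", "looks at receiver's face, then at handover location", 4.7),
]

# Group scores by (agent, gaze) condition and report the mean per condition.
by_condition = defaultdict(list)
for agent, gaze, score in ratings:
    by_condition[(agent, gaze)].append(score)

for (agent, gaze), scores in sorted(by_condition.items()):
    print(f"{agent:5s} | {gaze:55s} | mean naturalness: {mean(scores):.2f}")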