
Latest Publications from the 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)

Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)
Pub Date : 2019-03-01 DOI: 10.1145/3371382.3374850
T. Williams, D. Szafir, T. Chakraborti, H. B. Amor
The 2nd International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interactions (VAM-HRI) will bring together HRI, Robotics, and Mixed Reality researchers to identify challenges in mixed reality interactions between humans and robots. Topics relevant to the workshop include the development of robots that can interact with humans in mixed reality, the use of virtual reality for developing interactive robots, the design of new augmented reality interfaces that mediate communication between humans and robots, comparisons of the capabilities and perceptions of robots and virtual agents, and best design practices. VAM-HRI was held for the first time at HRI 2018, where it served as the first workshop of its kind at an academic AI or Robotics conference and as a timely call to arms to the academic community in response to the growing promise of this emerging field. VAM-HRI 2019 will build on the success of VAM-HRI 2018 and present new opportunities for expanding this nascent research community. Website: http://vam-hri.xyz/
Citations: 51
Learning from Corrective Demonstrations
Pub Date : 2019-03-01 DOI: 10.1109/HRI.2019.8673287
R. Gutierrez, Elaine Schaertl Short, S. Niekum, A. Thomaz
Robots deployed in human environments will inevitably encounter unmodeled scenarios which are likely to result in execution failures. To address this issue, we would like to allow co-present naive users to correct and improve the robot's behavior as these edge cases are encountered over time.
Citations: 6
Supplementary Material for Characterizing Input Methods for Human-to-Robot Demonstrations
Pub Date : 2019-03-01 DOI: 10.1109/HRI.2019.8673328
Pragathi Praveena, G. Subramani, Bilge Mutlu, Michael Gleicher
In this section, we discuss some extensions of Section III and expand on the limitations mentioned in Section VI of the main article.
Citations: 1
Analysis of User Interaction While Walking on Slopes Using Robotic Rollators – Pilot Study
Pub Date : 2019-03-01 DOI: 10.1109/HRI.2019.8673136
Sungyong Lee, Suncheol Kwon
Robotic rollators (motorized walkers) require appropriate assist performance to improve the quality of life of the elderly and disabled. In this pilot study, the performance of commercial robotic rollators was evaluated by analyzing the effect of load condition on muscle activity and on knee and hip joint angles when walking on a slope. Results showed that the highest muscle activity occurred under a load of 5 kg, but joint angles did not differ across load conditions. In future studies, we plan to test performance with a larger number of subjects. In addition, we will perform motion analysis and muscle activity analysis under varying assist, brake, and speed levels.
Citations: 1
Measurement of Moral Concern for Robots
Pub Date : 2019-03-01 DOI: 10.1109/HRI.2019.8673095
T. Nomura, T. Kanda, Sachie Yamada
We developed a self-report measurement, the Moral Concern for Robots Scale (MCRS), which measures whether people believe that a robot has moral standing, deserves moral care, and merits protection. The results of an online survey (N = 200) confirmed the concurrent and predictive validity of the scale, in that the scale scores successfully predicted people's intentions toward prosocial behaviors.
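As a hedged illustration of the predictive-validity check described in this abstract, the sketch below tests whether hypothetical MCRS scores predict hypothetical prosocial-intention ratings. The data are synthetic placeholders (only the sample size N = 200 matches the abstract), and the authors' actual analysis may differ.

```python
# Sketch only: synthetic data standing in for MCRS scores and prosocial-intention
# ratings; this is not the study's data or its exact analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200  # same sample size as the reported online survey

# Hypothetical MCRS total scores (e.g., mean of Likert items) for n respondents.
mcrs = rng.normal(loc=4.0, scale=1.0, size=n)

# Hypothetical prosocial-behavior intention ratings, loosely related to MCRS.
intent = 0.6 * mcrs + rng.normal(scale=0.8, size=n)

# Predictive validity is commonly summarized by a correlation and/or a simple
# regression of the criterion on the scale score.
r, p = stats.pearsonr(mcrs, intent)
fit = stats.linregress(mcrs, intent)

print(f"Pearson r = {r:.2f} (p = {p:.3g})")
print(f"intent = {fit.intercept:.2f} + {fit.slope:.2f} * MCRS")
```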
Citations: 6
Remote Supervision of an Unmanned Surface Vessel - A Comparison of Interfaces
Pub Date : 2019-03-01 DOI: 10.1109/HRI.2019.8673100
M. Lager, E. A. Topp, J. Malec
We compared three different Graphical User Interfaces (GUIs) that we designed and implemented to enable human supervision of an unmanned ship. Our findings indicate that a 3D GUI, presented either on a screen or in a Virtual Reality (VR) setting, provides several objective and subjective benefits compared to a Baseline GUI representing traditional tools.
Citations: 7
Collaborative User Responses in Multiparty Interaction with a Couples Counselor Robot
Pub Date : 2019-03-01 DOI: 10.1109/HRI.2019.8673177
Dina Utami, T. Bickmore
Intimate relationships are an integral part of human societies, yet many relationships are in distress. Couples counseling has been shown to be effective in preventing and alleviating relationship distress, yet many couples do not seek professional help due to cost, logistics, and discomfort in disclosing private problems. In this paper, we describe our efforts towards the development of a fully automated couples counselor robot, focusing specifically on the problem of identifying and processing “collaborative responses”, in which a human couple co-construct a response to a query from the robot. We present an analysis of collaborative responses obtained from a pilot study, then develop a data-driven model to detect the end of collaborative responses for regulating turn-taking during a counseling session. Our model uses a combination of multimodal features and achieves an offline weighted F-score of 0.81. Finally, we present findings from a quasi-experimental study in which a robot facilitated a counseling session designed to promote intimacy between romantic couples. Our findings suggest that the session improves couples' intimacy and positive affect. An online evaluation of the end-of-collaborative-response model demonstrates an F-score of 0.72.
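The evaluation above is reported as an offline weighted F-score. As a hedged illustration of that metric only (not the paper's model, features, or data), the sketch below computes a weighted F1 for a toy end-of-collaborative-response detector with scikit-learn.

```python
# Sketch only: toy labels and predictions for a binary end-of-response detector;
# the paper's actual multimodal model and data are not reproduced here.
from sklearn.metrics import f1_score

# 1 = "end of collaborative response", 0 = "response still ongoing"
y_true = [0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [0, 0, 1, 0, 0, 1, 0, 1, 1, 0]

# "Weighted" averages the per-class F1 scores, weighting each class by its
# support, which matters when the two classes are imbalanced.
print(f"weighted F1 = {f1_score(y_true, y_pred, average='weighted'):.2f}")
```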
Citations: 29
Heterogeneous Learning from Demonstration
Pub Date : 2019-03-01 DOI: 10.1109/hri.2019.8673267
Rohan R. Paleja, M. Gombolay
The development of human-robot systems able to leverage the strengths of both humans and their robotic counterparts has been greatly sought after because of the foreseen, broad-ranging impact across industry and research. We believe the true potential of these systems cannot be reached unless the robot is able to act with a high level of autonomy, reducing the burden of manual tasking or teleoperation. To achieve this level of autonomy, robots must be able to work fluidly with their human partners, inferring their needs without explicit commands. This inference requires the robot to be able to detect and classify the heterogeneity of its partners. We propose a framework for learning from heterogeneous demonstration based on Bayesian inference and evaluate a suite of approaches on a real-world dataset of StarCraft II gameplay. This evaluation provides evidence that our Bayesian approach can outperform conventional methods by up to 12.8%.
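To make the Bayesian-inference step concrete, the hedged sketch below infers a posterior over demonstrator types from a short sequence of observed discrete actions via Bayes' rule. The type labels, action model, and probabilities are invented for illustration and do not reproduce the paper's formulation or its StarCraft II evaluation.

```python
# Sketch only: invented demonstrator types and action probabilities; the paper's
# actual framework and StarCraft II features are not reproduced here.
import numpy as np

# Two hypothetical demonstrator types with different preferences over three
# discrete actions, P(action | type).
likelihoods = {
    "aggressive": np.array([0.7, 0.2, 0.1]),
    "defensive":  np.array([0.2, 0.3, 0.5]),
}
prior = {"aggressive": 0.5, "defensive": 0.5}

def posterior_over_types(observed_actions):
    """Bayes' rule: P(type | actions) is proportional to P(type) * prod P(a | type)."""
    unnormalized = {
        t: prior[t] * np.prod(likelihoods[t][observed_actions])
        for t in prior
    }
    z = sum(unnormalized.values())
    return {t: v / z for t, v in unnormalized.items()}

# A short demonstration dominated by action 0 points toward the "aggressive" type.
print(posterior_over_types([0, 0, 1, 0]))
```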
Citations: 6
Comparing Human-Robot Proxemics Between Virtual Reality and the Real World
Pub Date : 2019-03-01 DOI: 10.1109/HRI.2019.8673116
Rui Li, Marc van Almkerk, S. V. Waveren, E. Carter, Iolanda Leite
Virtual Reality (VR) can greatly benefit Human-Robot Interaction (HRI) as a tool to iterate effectively across robot designs. However, possible system limitations of VR could influence the results such that they do not fully reflect real-life encounters with robots. In order to better deploy VR in HRI, we need to establish a basic understanding of the differences between HRI studies in the real world and in VR. This paper investigates the differences between real life and VR with a focus on proxemic preferences, while also exploring the effects of visual familiarity and spatial sound within the VR experience. Results suggested that people prefer closer interaction distances with a real, physical robot than with a virtual robot in VR. Additionally, the virtual robot was perceived as more discomforting than the real robot, which could account for the differences in proxemics. Overall, these results indicate that the perception of the robot has to be evaluated before the interaction can be studied. However, the results also suggested that VR settings with different levels of visual familiarity are consistent with each other in how they affect HRI proxemics and virtual robot perceptions, indicating freedom to study HRI in various VR scenarios. The effect of spatial sound in VR drew a more complex picture and thus calls for more in-depth research to understand its influence on HRI in VR.
Citations: 3
Children's Responding to Humanlike Agents Reflects an Uncanny Valley
Pub Date : 2019-03-01 DOI: 10.1109/HRI.2019.8673088
M. Strait, Heather L. Urry, P. Muentener
Both perceptual mechanisms (e.g., threat detection/avoidance) and social mechanisms (e.g., fears fostered via negative media) may explain the existence of the uncanny valley; however, the existing literature lacks sufficient evidence to decide whether one, the other, or a combination best accounts for the valley's effects. Because perceptually oriented explanations imply the valley should be evident early in development, we investigated whether it appears in the responses of children (N = 80; ages 5–10) to agents of varying human similarity. We found that, like adults, children were most averse to highly humanlike robots (relative to less humanlike robots and humans). But, unlike adults, children's aversion did not translate into avoidance. The findings thus indicate, consistent with perceptual explanations, that the valley effect manifests well before adulthood. However, further research is needed to understand the emergence of the valley's behavioral consequences.
Citations: 16