
Latest publications from ACM Transactions on Human-Robot Interaction

Trust Estimation for Autonomous Vehicles by Measuring Pedestrian Behavior in VR
IF 5.1 Q2 ROBOTICS Pub Date : 2023-03-13 DOI: 10.1145/3568294.3580072
Ryota Masuda, Shintaro Ono, T. Hiraoka, Y. Suda
This study proposes a method to estimate pedestrian trust in an automated vehicle (AV) based on pedestrian behavior. Experiments were conducted in a VR environment in which an AV approached a crosswalk. Participants rated their trust in the AV on three levels before and while they crossed the road. The trust level can be estimated by deep learning from the pedestrian's skeletal coordinates and position, together with the vehicle's position and speed, over the past four seconds. The estimation accuracy was 61%.
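The abstract does not detail the network, so purely as an illustration of this kind of classifier — a model mapping a four-second window of skeletal coordinates, pedestrian position, vehicle position, and vehicle speed to one of three trust levels — a minimal sketch might look like the following. The LSTM encoder, the feature dimension, and the frame rate are assumptions for illustration, not the authors' architecture.

```python
import torch
import torch.nn as nn

class TrustLevelClassifier(nn.Module):
    """Maps a short window of pedestrian/vehicle features to one of three trust levels.

    Feature layout per time step (assumed for illustration): flattened 2D skeletal
    joint coordinates plus pedestrian position, vehicle position, and vehicle speed.
    """

    def __init__(self, feature_dim: int = 55, hidden_dim: int = 64, num_levels: int = 3):
        super().__init__()
        self.encoder = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_levels)

    def forward(self, window: torch.Tensor) -> torch.Tensor:
        # window: (batch, time_steps, feature_dim), e.g. 4 s of data at some frame rate
        _, (last_hidden, _) = self.encoder(window)
        return self.head(last_hidden[-1])           # (batch, num_levels) logits


# Example: a batch of 8 windows, 4 seconds sampled at an assumed 30 Hz, 55 features per frame.
model = TrustLevelClassifier()
logits = model(torch.randn(8, 4 * 30, 55))
predicted_level = logits.argmax(dim=1)              # 0 = low, 1 = medium, 2 = high (assumed coding)
```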
Citations: 0
Hey Robot, Can You Help Me Feel Less Lonely?: An Explorative Study to Examine the Potential of Using Social Robots to Alleviate Loneliness in Young Adults
IF 5.1 Q2 ROBOTICS Pub Date : 2023-03-13 DOI: 10.1145/3568294.3580135
Aike C. Horstmann
Young adults are an often-forgotten group that is heavily affected by loneliness. Their perceived social isolation often stems from attachment insecurities and social-skill deficiencies. Since robots can function as social interaction partners that exert less social pressure and display less social complexity, they may offer a promising approach to alleviating this problematic situation. The goal would not be to replace human interaction partners, but to diminish acute loneliness and its accompanying detrimental effects, and to serve as a social-skills coach and practice interaction partner. To explore the potential of this approach, a preregistered quantitative online study (N = 150) incorporating a video-based interaction with a social robot and qualitative elements was conducted. Initial results show that young adults report less state loneliness after interacting with the robot than before. People with a strong affinity for technology evaluate the robot's sociability and the interaction with it more positively, whereas people with a generally negative attitude towards robots evaluate them less positively. Furthermore, the more trait loneliness people report experiencing, the less sociable they perceive the robot to be.
Citations: 0
Development of a University Guidance and Information Robot
IF 5.1 Q2 ROBOTICS Pub Date : 2023-03-13 DOI: 10.1145/3568294.3580138
A. Blair, M. Foster
We are developing a social robot that will be deployed in a large, recently built university building designed for learning and teaching. We outline the design process for this robot, which has included consultations with stakeholders including members of university services, students and other visitors to the building, as well as members of the "Reach Out" team who normally provide in-person support in the building. These consultations have resulted in a clear specification of the desired robot functionality, which will combine central helpdesk queries with local information about the building and the surrounding university campus. We outline the technical components that will be used to develop the robot system, and also describe how the success of the deployed robot will be evaluated.
Citations: 1
Implications of AI Bias in HRI: Risks (and Opportunities) when Interacting with a Biased Robot
IF 5.1 Q2 ROBOTICS Pub Date : 2023-03-13 DOI: 10.1145/3568162.3576977
Tom Hitron, Noa Morag Yaar, H. Erel
Social robotic behavior is commonly designed using AI algorithms which are trained on human behavioral data. This training process may result in robotic behaviors that echo human biases and stereotypes. In this work, we evaluated whether an interaction with a biased robotic object can increase participants' stereotypical thinking. In the study, a gender-biased robot moderated debates between two participants (man and woman) in three conditions: (1) The robot's behavior matched gender stereotypes (Pro-Man); (2) The robot's behavior countered gender stereotypes (Pro-Woman); (3) The robot's behavior did not reflect gender stereotypes and did not counter them (No-Preference). Quantitative and qualitative measures indicated that the interaction with the robot in the Pro-Man condition increased participants' stereotypical thinking. In the No-Preference condition, stereotypical thinking was also observed but to a lesser extent. In contrast, when the robot displayed counter-biased behavior in the Pro-Woman condition, stereotypical thinking was eliminated. Our findings suggest that HRI designers must be conscious of AI algorithmic biases, as interactions with biased robots can reinforce implicit stereotypical thinking and exacerbate existing biases in society. On the other hand, counter-biased robotic behavior can be leveraged to support present efforts to address the negative impact of stereotypical thinking.
Citations: 3
Effects of Predictive Robot Eyes on Trust and Task Performance in an Industrial Cooperation Task
IF 5.1 Q2 ROBOTICS Pub Date : 2023-03-13 DOI: 10.1145/3568294.3580123
L. Onnasch, Paul Schweidler, Maximilian Wieser
Industrial cobots can perform variable action sequences. For human-robot interaction (HRI) this can have detrimental effects, as the robot's actions can be difficult to predict. In human interaction, eye gaze intuitively directs attention and communicates subsequent actions. Whether this mechanism can also benefit HRI is not well understood. This study investigated the impact of anthropomorphic eyes as directional cues in robot design. 42 participants worked on two consecutive tasks in an embodied HRI with a Sawyer robot. The study used a between-subjects design and presented anthropomorphic eyes, arrows, or a black screen (control condition) on the robot's display. Results showed that neither the directional stimuli nor the anthropomorphic design in particular led to increased trust. However, anthropomorphic robot eyes improved prediction speed, whereas this effect was not found for non-anthropomorphic cues (arrows). Anthropomorphic eyes therefore seem better suited for implementation on an industrial robot.
Citations: 1
Reactive Planning for Coordinated Handover of an Autonomous Aerial Manipulator
IF 5.1 Q2 ROBOTICS Pub Date : 2023-03-13 DOI: 10.1145/3568294.3580055
Jérôme Truc, D. Sidobre, R. Alami
In this paper, we present a coordinated and reactive human-aware motion planner for performing a handover task with an autonomous aerial manipulator (AAM). We present a method to determine the final state of the AAM for the handover based on the current state of the human and the surrounding obstacles. We consider the human's visual field, the effort required to turn the head and see the AAM, and the discomfort caused to the human. We apply these social constraints together with the kinematic constraints of the AAM to determine its coordinated motion along the trajectory.
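The abstract lists the social constraints but not how they are combined; one plausible reading is a weighted cost over candidate handover positions, where visibility, head-turning effort, and discomfort each add a penalty. The sketch below illustrates that reading only — the cost terms, weights, distance thresholds, and the 30° comfort cone are all assumptions, not the authors' formulation.

```python
import numpy as np

def social_cost(candidate: np.ndarray,
                human_pos: np.ndarray,
                gaze_dir: np.ndarray,
                obstacle_dists: np.ndarray,
                w_visibility: float = 1.0,
                w_head_turn: float = 0.5,
                w_discomfort: float = 2.0) -> float:
    """Score a candidate handover position for an aerial manipulator (lower is better).

    All terms are illustrative approximations:
      - visibility / head-turn: angle between the human's gaze direction (unit vector)
        and the direction towards the candidate position,
      - discomfort: penalty for approaching the human or obstacles too closely.
    """
    to_candidate = candidate - human_pos
    dist = np.linalg.norm(to_candidate)
    angle = np.arccos(np.clip(np.dot(gaze_dir, to_candidate / dist), -1.0, 1.0))

    visibility_penalty = angle / np.pi                             # 0 in front, 1 directly behind
    head_turn_penalty = max(0.0, angle - np.deg2rad(30)) / np.pi   # free within an assumed 30° cone
    discomfort_penalty = max(0.0, 1.2 - dist) + max(0.0, 0.5 - obstacle_dists.min())

    return (w_visibility * visibility_penalty
            + w_head_turn * head_turn_penalty
            + w_discomfort * discomfort_penalty)


# Pick the best candidate from a sampled set around the human (illustrative usage).
human = np.array([0.0, 0.0, 1.6])
gaze = np.array([1.0, 0.0, 0.0])
candidates = [np.array([1.0, 0.3, 1.4]), np.array([-0.8, 0.0, 1.4])]
best = min(candidates, key=lambda c: social_cost(c, human, gaze, np.array([2.0])))
```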
Citations: 0
Towards Robot Learning from Spoken Language
IF 5.1 Q2 ROBOTICS Pub Date : 2023-03-13 DOI: 10.1145/3568294.3580053
K. Kodur, Manizheh Zand, Maria Kyrarini
The paper proposes a robot learning framework that empowers a robot to automatically generate a sequence of actions from unstructured spoken language. The framework was able to distinguish between instructions and unrelated conversation. Data were collected from 25 participants, who were asked to instruct the robot to perform a collaborative cooking task while being interrupted and distracted. The system was able to identify the sequence of instructed actions for a cooking task with an accuracy of 92.85 ± 3.87%.
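The framework's internals are not given in the abstract. As a toy stand-in for the idea of filtering instructions out of free-form speech and turning them into an ordered action list, the sketch below uses a simple keyword rule; the verb vocabulary, the filtering rule, and the transcript are hypothetical, and the actual framework is presumably learned rather than rule-based.

```python
import re

# Toy vocabulary of cooking actions the robot is assumed to know (hypothetical).
KNOWN_ACTIONS = {"grab", "pick", "pour", "cut", "stir", "place", "hand"}

def extract_action_sequence(utterances: list[str]) -> list[str]:
    """Keep only utterances that look like instructions and map them to actions.

    An utterance counts as an instruction here if it contains a known action verb,
    and the extracted action is that verb plus the rest of the clause.
    """
    actions = []
    for utterance in utterances:
        words = re.findall(r"[a-z']+", utterance.lower())
        for i, word in enumerate(words):
            if word in KNOWN_ACTIONS:
                actions.append(" ".join(words[i:]))
                break   # one action per utterance in this simple sketch
    return actions


transcript = [
    "Could you grab the cutting board for me?",
    "By the way, how was your weekend?",          # unrelated conversation, ignored
    "Now pour the broth into the pot.",
    "Stir it slowly please.",
]
print(extract_action_sequence(transcript))
# ['grab the cutting board for me', 'pour the broth into the pot', 'stir it slowly please']
```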
Citations: 1
Understanding Differences in Human-Robot Teaming Dynamics between Deaf/Hard of Hearing and Hearing Individuals
IF 5.1 Q2 ROBOTICS Pub Date : 2023-03-13 DOI: 10.1145/3568294.3580146
A'di Dust, Carola Gonzalez-Lebron, Shannon Connell, Saurav Singh, Reynold Bailey, Cecilia Ovesdotter Alm, Jamison Heard
With the development of Industry 4.0, more collaborative robots are being deployed in manufacturing environments. Hence, research in human-robot interaction (HRI) and human-cobot interaction (HCI) is gaining traction. However, the design of how cobots interact with humans has typically focused on the general able-bodied population, and these interactions are sometimes ineffective for specific groups of users. This study's goal is to identify interaction differences between deaf/hard of hearing and hearing individuals when interacting with cobots. Understanding these differences may promote inclusiveness by detecting ineffective interactions, reasoning about why an interaction failed, and adapting the framework's interaction strategy appropriately.
Citations: 0
Who to Teach a Robot to Facilitate Multi-party Social Interactions?
IF 5.1 Q2 ROBOTICS Pub Date : 2023-03-13 DOI: 10.1145/3568294.3580056
Jouh Yeong Chew, Keisuke Nakamura
One salient function of social robots is to play the role of facilitator, enhancing the harmony state of multi-party social interactions so that every human participant is encouraged and motivated to engage actively. However, it is challenging to handcraft the behavior of social robots to achieve this objective. One promising approach is for the robot to learn from human teachers. This paper reports the findings of an empirical test to determine the optimal experimental condition for a robot to learn verbal and nonverbal strategies for facilitating a multi-party interaction. First, the modified L8 Orthogonal Array (OA) is used to design a fractional factorial experimental condition using factors such as the type of human facilitator, group size, and stimulus type. The response of the OA is the harmony state, explicitly defined using the speech turn-taking between speakers and represented using metrics extracted from the first-order Markov transition matrix. Analyses of main effects and ANOVA suggest that the type of human facilitator and the group size are significant factors affecting the harmony state. Therefore, we propose the optimal experimental condition for training a facilitator robot: high school teachers as human teachers and a group size larger than four participants.
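The abstract only outlines how the harmony state is represented; a minimal sketch of the underlying bookkeeping — estimating a first-order Markov transition matrix from the ordered speaker turns and summarising it with a simple spread measure — could look like the following. The entropy-based summary is an assumed example, not necessarily the metric the authors extract.

```python
import numpy as np

def turn_taking_transition_matrix(turns: list[str]) -> tuple[list[str], np.ndarray]:
    """Estimate P(next speaker | current speaker) from an ordered list of speaker turns."""
    speakers = sorted(set(turns))
    index = {s: i for i, s in enumerate(speakers)}
    counts = np.zeros((len(speakers), len(speakers)))
    for current, nxt in zip(turns, turns[1:]):
        counts[index[current], index[nxt]] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    matrix = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
    return speakers, matrix


def harmony_score(matrix: np.ndarray) -> float:
    """Illustrative summary: mean row entropy, higher when turns are spread evenly."""
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(matrix > 0, np.log(matrix), 0.0)
    return float(-(matrix * logs).sum(axis=1).mean())


# Example: turn sequence of a three-person group with one facilitator (hypothetical data).
turns = ["A", "B", "C", "A", "C", "B", "A", "B"]
speakers, P = turn_taking_transition_matrix(turns)
print(speakers, harmony_score(P))
```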
Citations: 0
Perception-Intention-Action Cycle as a Human Acceptable Way for Improving Human-Robot Collaborative Tasks
IF 5.1 Q2 ROBOTICS Pub Date : 2023-03-13 DOI: 10.1145/3568294.3580149
J. E. Domínguez-Vidal, Nicolás Rodríguez, A. Sanfeliu
In Human-Robot Collaboration (HRC) tasks, the classical Perception-Action cycle cannot fully explain the collaborative behaviour of the human-robot pair until it is extended to the Perception-Intention-Action (PIA) cycle, which gives the human's intention a key role at the same level as the robot's perception rather than as a sub-block of it. Although part of the human's intention can be perceived or inferred by the other agent, this is prone to misunderstandings, so in some cases the true intention has to be explicitly communicated to fulfill the task. Here, we explore both types of intention and combine them with the robot's perception through the concept of Situation Awareness (SA). We validate the PIA cycle and its acceptance by the user with a preliminary experiment in an object transportation task, showing that its use can increase trust in the robot.
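Purely as an illustrative reading of the PIA cycle, the toy loop below treats the human's intention as a first-class input alongside perception and prefers an explicitly communicated intention over an inferred one, since inference is prone to misunderstanding. All function names, state fields, and the action mapping are hypothetical, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WorldState:
    perception: dict                      # what the robot senses (poses, forces, ...)
    inferred_intention: Optional[str]     # intention guessed from perceived behaviour
    stated_intention: Optional[str]       # intention explicitly communicated by the human

def resolve_intention(state: WorldState) -> Optional[str]:
    """Explicit communication wins over inference, since inference can be misread."""
    return state.stated_intention or state.inferred_intention

def pia_step(state: WorldState) -> str:
    """One Perception-Intention-Action iteration for a shared transport task (toy example)."""
    intention = resolve_intention(state)
    if intention == "turn_left":
        return "yield_and_rotate_left"
    if intention == "stop":
        return "hold_position"
    return "continue_straight"            # default when no intention is available


state = WorldState(perception={"object_pose": (1.0, 0.2)},
                   inferred_intention="turn_left",
                   stated_intention=None)
print(pia_step(state))                    # -> "yield_and_rotate_left"
```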
Citations: 0