Evaluating Human-Care Robot Services for the Elderly: An Experimental Study
Miyoung Cho, Dohyung Kim, Minsu Jang, Jaeyeon Lee, Jaehong Kim, Woo-han Yun, Youngwoo Yoon, Jinhyeok Jang, Chankyu Park, Woo-Ri Ko, Jaeyoon Jang, Ho-Sub Yoon, Daeha Lee, Choulsoo Jang
Pub Date : 2024-07-08  DOI: 10.1007/s12369-024-01157-7
The increasing elderly population is emerging as a serious social issue, and the coronavirus pandemic increased the number of elderly people suffering from depression and loneliness owing to the lack of face-to-face activities. In this study, we developed an integrated system for a human-care robot service that provides cognitive and emotional support for elderly people, and we verified its stability and usefulness in the real world. We recruited 40 elderly people for an experiment in an apartment testbed environment, and two elderly people who had long lived alone participated in the experiment at their own homes. Quantitative results were analyzed by comparing service success rates and user satisfaction across the two test environments to verify the stability of the service; qualitative evaluations through surveys and interviews assessed its usefulness.
Intuitive and Interactive Robotic Avatar System for Tele-Existence: TEAM SNU in the ANA Avatar XPRIZE Finals
Beomyeong Park, Donghyeon Kim, Daegyu Lim, Suhan Park, Junewhee Ahn, Seungyeon Kim, Jaeyong Shin, Eunho Sung, Jaehoon Sim, Junhyung Kim, Myeong-Ju Kim, Junhyeok Cha, Gyeongjae Park, Hokyun Lee, Seungbin You, Keunwoo Jang, Seung-Hun Kim, Mathew Schwartz, Jaeheung Park
Pub Date : 2024-06-28  DOI: 10.1007/s12369-024-01152-y
Avatar robots enable the teleoperation and telepresence of an operator with a rich and meaningful sense of existence in another location. Robotic avatar systems rely on intuitive interactions to afford operators comfortable and accurate robot control for various tasks, and the ability of operators to feel immersed within a robot has drawn interest across multiple research fields exploring the future capabilities of such systems. This paper presents a robotic avatar system based on a custom humanoid robot, TOCABI, with a mobile base; its teleoperation system was developed in response to the ANA Avatar XPRIZE. Combining the life-size humanoid robot with the mobile base allows for improved mobility and dexterous manipulation. The system comprises the robot and mobile base plus an operator station that incorporates haptic feedback devices, trackers, a head-mounted display, gloves, and pedals. These devices connect the robot-environment interaction and the operator-avatar experience through visual, auditory, tactile, haptic, and kinesthetic feedback. Combining the untethered, battery-operated, Wi-Fi-enabled robot with these sensory channels enables intuitive control through the operator's body movement. The system's performance was evaluated through user studies and demonstrated by Team SNU in the ANA Avatar XPRIZE Finals, where it completed 8 of the 10 missions, placing the team eighth among the 17 finalists.
Pilot Study on the Relationship Between Acceptance of Collaborative Robots and Stress
Erika Lutin, Shirley A. Elprama, Jan Cornelis, Patricia Leconte, Bart Van Doninck, Maarten Witters, Walter De Raedt, An Jacobs
Pub Date : 2024-06-27  DOI: 10.1007/s12369-024-01156-8
Currently, collaborative robots (cobots) are mostly programmed to do one task repetitively; they can be programmed at different speeds and work near human operators. Our research investigated the effect of robot speed on individuals' acceptance, subjective and objective stress, and cognitive workload. We therefore organized a repeated-measures experiment in which participants (N = 25) conducted an assembly task with ABB's YuMi cobot at a low and at a high speed. Subjective and physiological responses were collected, and participants underwent a standardized stress test. Our results indicate that when working with a cobot at high speed, people believe they can work faster and be more productive, but they also experience a higher workload and higher perceived stress. We also found that tonic electrodermal activity (EDA) is a significant physiological predictor for monitoring perceived stress: the relative increase in tonic EDA from baseline to task execution was greater in high-speed mode than in low-speed mode, and this increase correlated significantly with participants' perceived stress levels. However, workload could not be predicted by any of the physiological measures. Future research should explore the effect of even higher cobot working speeds and the use of physiological measures (such as stress) as input to guide the collaboration between individuals and cobots.
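The baseline-to-task EDA comparison described in this abstract can be sketched in a few lines. Everything below is illustrative: the signal values, participant count, and variable names are hypothetical placeholders, not the authors' recordings or analysis code.

```python
import numpy as np

def relative_increase(baseline, task):
    """Relative change in mean tonic EDA from baseline to task execution."""
    b, t = np.mean(baseline), np.mean(task)
    return (t - b) / b

# Simulated tonic EDA (microsiemens) for 5 hypothetical participants,
# 60 samples per condition; levels chosen so the high-speed task runs hotter.
rng = np.random.default_rng(0)
baselines  = [rng.normal(2.0, 0.1, 60) for _ in range(5)]
low_speed  = [rng.normal(2.2, 0.1, 60) for _ in range(5)]
high_speed = [rng.normal(2.6, 0.1, 60) for _ in range(5)]

inc_low  = [relative_increase(b, t) for b, t in zip(baselines, low_speed)]
inc_high = [relative_increase(b, t) for b, t in zip(baselines, high_speed)]

# The reported pattern: a larger relative tonic-EDA increase in high-speed mode.
print(np.mean(inc_low), np.mean(inc_high))

# Correlating the increase with self-reported stress (hypothetical scores).
perceived_stress = np.array([3.1, 4.0, 4.4, 3.5, 4.8])
r = np.corrcoef(inc_high, perceived_stress)[0, 1]
print(round(r, 2))
```

A real analysis would first decompose the EDA signal into tonic and phasic components and test the correlation for significance; this sketch only shows the shape of the comparison.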
Robot-Mediated Intergenerational Childcare: Experimental Study Based on Health-Screening Task in Nursery School
Junya Nakanishi, Jun Baba, Wei-Chuan Chang, Aya Nakae, Hidenobu Sumioka, Hiroshi Ishiguro
Pub Date : 2024-06-21  DOI: 10.1007/s12369-024-01149-7
Intergenerational interactions between children and older adults are gaining broader recognition because of their mutual benefits. However, such interactions face practical limitations owing to potential disease transmission and the poor health of some older adults, which hinder face-to-face contact. This study explores robot-mediated interaction as a potential solution: older adults remotely controlled a social robot to perform a health-screening task for nursery school children, thereby avoiding the problems of face-to-face interaction while still engaging in physical interaction. The results suggest that the children responded favorably to the robot and that the rate of positive responses increased over time. The older adults found the task generally manageable and experienced a significant positive shift in their attitude toward children. These findings suggest that robot-mediated interaction can effectively facilitate intergenerational engagement and provide psychosocial benefits to both parties, offering valuable insights into its potential in childcare and similar settings.
How Does Children's Anthropomorphism of a Social Robot Develop Over Time? A Six-Wave Panel Study
Rinaldo Kühne, Jochen Peter, Chiara de Jong, Alex Barco
Pub Date : 2024-06-18  DOI: 10.1007/s12369-024-01155-9
Research on children's anthropomorphism of social robots is mostly cross-sectional and based on a single measurement. However, because social robots are a new type of technology with which children have little experience, children's initial responses may be biased by a novelty effect, and a single measurement may not accurately reflect how children anthropomorphize social robots over time. We therefore used data from a six-wave panel study to investigate longitudinal changes in 8- to 9-year-old children's anthropomorphism of a social robot. Latent class growth analyses revealed that anthropomorphism peaked after the first interaction with the robot, remained stable for a brief period, and then decreased. Moreover, two distinct longitudinal trajectories could be identified: one with moderate to high anthropomorphism and one with low to moderate anthropomorphism. Previous media exposure to non-fictional robots increased the probability that children experienced higher levels of anthropomorphism.
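The peak-then-decline pattern this abstract reports can be illustrated numerically. The six-wave scores below are invented for illustration; they are not the study's estimates, and this summary is a far simpler stand-in for a latent class growth analysis.

```python
import numpy as np

# Hypothetical six-wave anthropomorphism scores (1-5 scale) for the two
# trajectory classes the abstract describes.
waves = np.arange(1, 7)
high_class = np.array([3.4, 3.9, 3.9, 3.6, 3.3, 3.1])  # moderate-to-high class
low_class  = np.array([2.1, 2.6, 2.6, 2.3, 2.1, 2.0])  # low-to-moderate class

def describe(scores):
    """Summarize a trajectory: the wave at which it peaks, and whether it
    declines from that peak by the final wave."""
    peak_idx = int(np.argmax(scores))
    peak_wave = int(waves[peak_idx])
    declined = bool(scores[-1] < scores[peak_idx])
    return peak_wave, declined

# Both invented trajectories peak at wave 2, i.e. right after the first
# interaction, then decline - matching the reported pattern.
print(describe(high_class))
print(describe(low_class))
```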
Effects of Demonstrating Consensus Between Robots to Change User's Opinion
Kazuki Sakai, Koh Mitsuda, Yuichiro Yoshikawa, Ryuichiro Higashinaka, Takashi Minato, Hiroshi Ishiguro
Pub Date : 2024-06-18  DOI: 10.1007/s12369-024-01151-z
In recent years, research on humanoid robots that can change users' opinions has been conducted extensively. In particular, two robots have been shown to improve their persuasiveness by cooperating with each other in a sophisticated manner. Previous studies evaluated changes in opinion when robots demonstrated consensus building, but users did not participate in the conversations, and the optimal strategy may depend on their prior opinions. In this study, we therefore developed a system that adaptively changes the conversation between robots based on the user's opinion, and we investigated how opinions change when the robots' discussion converges to the same position as the user versus a different position. We conducted two experiments with human subjects in which a user and virtual robot agents talked with each other using button-based responses in a crowdsourced setting. The results showed that users who were confident in their opinions became more confident when the robot agents' opinions converged to their own position and less confident when they converged to a different position. These findings contribute to persuasion research using multiple robots and to the development of advanced dialogue coordination between robots.
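The adaptive convergence logic this abstract describes can be sketched as a small decision rule. The names (`UserState`, `choose_convergence`), the binary opinion labels, and the unit confidence shift are all illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    position: str    # "agree" or "disagree" with the discussion topic
    confident: bool  # whether the user is confident in their opinion

def choose_convergence(user: UserState, condition: str) -> str:
    """Pick the position the two robots' discussion converges on,
    depending on the experimental condition ("same" or "different")."""
    if condition == "same":
        return user.position
    return "disagree" if user.position == "agree" else "agree"

def confidence_shift(user: UserState, converged_on: str) -> int:
    """Direction of the confidence change the abstract reports for
    confident users: +1 when the robots converge on the user's own
    position, -1 otherwise; no effect modeled for unconfident users."""
    if not user.confident:
        return 0
    return 1 if converged_on == user.position else -1

user = UserState(position="agree", confident=True)
print(confidence_shift(user, choose_convergence(user, "same")))       # 1
print(confidence_shift(user, choose_convergence(user, "different")))  # -1
```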
Enhancing the Mobile Humanoid Robot's Emotional Expression with Affective Vertical-Oscillations
Changzeng Fu, Meneses Alexis, Y. Yoshikawa, Hiroshi Ishiguro
Pub Date : 2024-06-14  DOI: 10.1007/s12369-024-01142-0
Analysing Children's Responses from Multiple Modalities During Robot-Assisted Assessment of Mental Wellbeing
N. I. Abbasi, Micol Spitale, Joanna Anderson, Tamsin Ford, Peter B. Jones, Hatice Gunes
Pub Date : 2024-06-13  DOI: 10.1007/s12369-024-01137-x
Does giving students feedback on their concept maps through an on-screen avatar or a humanoid robot make a difference?
Burak Sisman, J. Steinrücke, Ton de Jong
Pub Date : 2024-06-06  DOI: 10.1007/s12369-024-01144-y
A Taxonomy of Explanation Types and Need Indicators in Human–Agent Collaborations
Lennart Wachowiak, Andrew Coles, Gerard Canal, Oya Celiktutan
Pub Date : 2024-06-05  DOI: 10.1007/s12369-024-01148-8
In recent years, explanations have become a pressing matter in AI research, driven by the increased use of black-box models and a growing realization of the importance of trustworthy AI. Explanations are particularly necessary in human–agent interactions to ensure that the user can trust the agent and that collaborations are effective. Human–agent interactions are complex social scenarios involving a user, an autonomous agent, and an environment or task with its own distinct properties; such interactions therefore require a wide variety of explanations, which are not covered by the methods of any single AI discipline, such as computer vision or natural language processing. In this paper, we map out which types of explanations are important for human–agent interactions, surveying the field via a scoping review. In addition to the introspective explanations typically tackled by explainability researchers, we examine assistive explanations, which aim to support the user with their task. Second, we survey what causes the need for an explanation in the first place, identifying a variety of interaction-specific causes and categorizing them by whether they center on the agent's behavior, the user's mental state, or an external entity. Our overview aims to guide robotics practitioners in designing agents with more comprehensive explanation-related capacities, considering different explanation types and the concrete times when explanations should be given.
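The three-way categorization of explanation-need causes in this abstract lends itself to a simple encoding. The category names come from the abstract; the example causes filed under each are hypothetical, chosen only to show how a practitioner might use such a taxonomy.

```python
from enum import Enum

class NeedCenter(Enum):
    """What an explanation-need cause is centered on, per the taxonomy."""
    AGENT_BEHAVIOR = "agent's behavior"
    USER_MENTAL_STATE = "user's mental state"
    EXTERNAL_ENTITY = "external entity"

# Hypothetical example causes, not taken from the paper's survey.
EXAMPLE_CAUSES = {
    "robot takes an unexpected action": NeedCenter.AGENT_BEHAVIOR,
    "user appears confused": NeedCenter.USER_MENTAL_STATE,
    "task environment changes": NeedCenter.EXTERNAL_ENTITY,
}

def causes_centered_on(center: NeedCenter):
    """List the recorded causes belonging to one category."""
    return [cause for cause, c in EXAMPLE_CAUSES.items() if c is center]

print(causes_centered_on(NeedCenter.AGENT_BEHAVIOR))
```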