Pub Date: 2024-03-06 | DOI: 10.1007/s12369-024-01111-7
Co-existing with Drones: A Virtual Exploration of Proxemic Behaviours and Users’ Insights on Social Drones
Abstract
Numerous studies have investigated proxemics in the context of human–robot interactions, but little is known about whether these insights can be applied to human–drone interactions (HDI). As drones become more common in social settings, it is crucial to ensure they navigate in a socially acceptable and human-friendly way. Understanding how individuals position themselves around drones is vital to promote user well-being and drones’ social acceptance. However, real-world constraints and risks associated with drones flying in close proximity to participants have limited research in this field. Virtual reality is a promising alternative for investigating HDI, as prior research suggests. This paper presents a proxemic user study (N = 45) in virtual reality, examining how drone height and framing influence participants’ proxemic preferences. The study also explores participants’ perceptions of social drones and their vision for the future of flying robots. Our findings show that drone height significantly impacted participants’ preferred interpersonal distance, while framing had no significant effect. Participants’ thoughts on how they envision social drones (e.g., interaction, design, applications) reveal interpersonal differences but also show overall consistency over time. While the study demonstrates the value of using virtual reality for HDI experiments, further research is necessary to determine the generalizability of our findings to real-world HDI scenarios.
{"title":"Co-existing with Drones: A Virtual Exploration of Proxemic Behaviours and Users’ Insights on Social Drones","authors":"","doi":"10.1007/s12369-024-01111-7","DOIUrl":"https://doi.org/10.1007/s12369-024-01111-7","url":null,"abstract":"<h3>Abstract</h3> <p>Numerous studies have investigated proxemics in the context of human–robot interactions, but little is known about whether these insights can be applied to human–drone interactions (HDI). As drones become more common in social settings, it is crucial to ensure they navigate in a socially acceptable and human-friendly way. Understanding how individuals position themselves around drones is vital to promote user well-being and drones’ social acceptance. However, real-world constraints and risks associated with drones flying in close proximity to participants have limited research in this field. Virtual reality is a promising alternative for investigating HDI, as prior research suggests. This paper presents a proxemic user study (N = 45) in virtual reality, examining how drone height and framing influence participants’ proxemic preferences. The study also explores participants’ perceptions of social drones and their vision for the future of flying robots. Our findings show that drone height significantly impacts participants’ preferred interpersonal distance, while framing had no significant effect. Thoughts on how participants envision social drones (e.g., interaction, design, applications) reveal interpersonal differences but also shows overall consistency over time. While the study demonstrates the value of using virtual reality for HDI experiments, further research is necessary to determine the generalizability of our findings to real-world HDI scenarios.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"41 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2024-03-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140047529","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-03-04 | DOI: 10.1007/s12369-024-01113-5
What Does it Mean to Measure Mind Perception toward Robots? A Critical Review of the Main Self-Report Instruments
Victor Galvez, Esther Hanono
Although most studies that seek to measure participants’ judgments and attitudes regarding humanoid robots’ possessing (or appearing to possess) a mind or mental capacities have been based on verbal measures, there is as yet no standard psychometric instrument for this purpose. Using a COSMIN approach, this critical review seeks to summarize the most valid and reliable self-report instruments that aim to measure mental state attribution to humanoid robots. Of the 501 papers reviewed, only 11 were included. We found that: (1) the instruments do not usually measure mental state attribution toward robots as an exclusive phenomenon but as a factor associated with the tendency to anthropomorphize non-human entities; (2) there is a lack of consensus regarding a definition of mental state attribution and the psychometric dimensions that underlie it; (3) the tendency to anthropomorphize does not by itself imply the attribution of mind to robots. In our discussion, we delve into the general problem of mind perception/attribution and speculate on the possible theoretical basis for a multifactorial model for measuring mind perception as part of a broader phenomenon we term “psycheidolia.”
{"title":"What Does it Mean to Measure Mind Perception toward Robots? A Critical Review of the Main Self-Report Instruments","authors":"Victor Galvez, Esther Hanono","doi":"10.1007/s12369-024-01113-5","DOIUrl":"https://doi.org/10.1007/s12369-024-01113-5","url":null,"abstract":"<p>Although most studies that seek to measure participants’ judgments and attitudes regarding humanoid robots’ possessing (or appearing to possess) a mind or mental capacities have been based on verbal measures, there is yet no standard psychometric instrument for this end. Using a COSMIN approach, this critical review seeks to summarize the most valid and reliable self-report instruments that aim to measure mental state attribution to humanoid robots. 501 papers were reviewed, but only 11 were included, finding that: (1) The instruments do not usually measure mental state attribution toward robots as an exclusive phenomenon but as a factor associated with the tendency to anthropomorphize non-human entities; (2) There is a lack of consensus regarding a definition of mental state attribution and the psychometric dimensions that underlie it; (3) The tendency to anthropomorphize does not by itself imply the attribution of mind to robots. In our discussion, we delve into the general problem of mind perception/attribution and speculate on the possible theoretical basis for a multifactorial model for measuring mind perception as part of a broader phenomenon we term “psycheidolia.”</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"7 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2024-03-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140032515","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-03-01 | DOI: 10.1007/s12369-023-01077-y
Social robot accessories for tailoring and appropriation of social robots
Abstract
The design of robots for everyday use should take into account the specific nature of the individual end-user and the possibility of interactions with multiple users in diverse scenarios, promoting versatility and increasing the chances of their successful adoption in everyday environments. Most robots are designed, however, to perform tasks and interact in typical social scenarios with an abstract human user. We observed a recent surge in the use of accessories with social robots, which aligns with a broader trend of consumers’ preference for personalizing the technologies they interact with. Drawing from the concepts of adaptability and customizability in collaborative systems, we explore the potential use of accessory-like items for social robots to enable low-tech customization and user appropriation, thus enhancing their value and suitability in various social situations. We draw from Human-Computer Interaction and Computer-Supported Co-operative Work literature to show how end-user customizability and appropriation are essential, but less frequently considered, in the study and design of social robots. We conceptualize Social Robot Accessories (SRAs) as a way for end-users to customize robots, and present three studies - (1) a literature survey on accessory-like item use with social robots, (2) a survey of commercially available robot accessories, and (3) a Twitter-based analysis of accessory use for AIBO and LOVOT robots by their users. We use findings from these studies to envision a design space of SRAs for use by Human-Robot Interaction (HRI) researchers.
{"title":"Social robot accessories for tailoring and appropriation of social robots","authors":"","doi":"10.1007/s12369-023-01077-y","DOIUrl":"https://doi.org/10.1007/s12369-023-01077-y","url":null,"abstract":"<h3>Abstract</h3> <p>The design of robots for everyday use should take into account the specific nature of the individual end-user and the possibility of interactions with multiple users in diverse scenarios, promoting versatility and increasing the chances of their successful adoption in everyday environments. Most robots are designed, however, to perform tasks and interact in typical social scenarios with an abstract human user. We observed a recent surge in the use of accessories with social robots, which aligns with a broader trend of consumers’ preference for personalizing the technologies they interact with. Drawing from the concepts of adaptability and customizability in collaborative systems, we explore the potential use of accessory-like items for social robots to enable low-tech customization and user appropriation, thus enhancing their value and suitability in various social situations. We draw from Human-Computer Interaction and Computer-Supported Co-operative Work literature to show how end-user customizability and appropriation are essential, but less frequently considered, in the study and design of social robots. We conceptualize Social Robot Accessories (SRAs) as a way for end-users to customize robots, and present three studies - (1) a literature survey on accessory-like item use with social robots, (2) a survey of commercially available robot accessories, and (3) a Twitter-based analysis of accessory use for AIBO and LOVOT robots by their users. We use findings from these studies to envision a design space of SRAs for use by Human-Robot Interaction (HRI) researchers.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"13 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2024-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140011644","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-02-29 | DOI: 10.1007/s12369-024-01107-3
Previous Experience Matters: An in-Person Investigation of Expectations in Human–Robot Interaction
Julia Rosén, Jessica Lindblom, Maurice Lamb, Erik Billing
The human–robot interaction (HRI) field goes beyond the mere technical aspects of developing robots, often investigating how humans perceive robots. Human perceptions and behavior are determined, in part, by expectations. Given the impact of expectations on behavior, it is important to understand what expectations individuals bring into HRI settings and how those expectations may affect their interactions with the robot over time. For many people, social robots are not a common part of their experiences, so any expectations they have of social robots are likely shaped by other sources. As a result, individual expectations coming into HRI settings may be highly variable. Although there has been some recent interest in expectations within the field, there is an overall lack of empirical investigation into their impact on HRI, especially in in-person robot interactions. To this end, a within-subject in-person study (N = 31) was performed in which participants were instructed to engage in open conversation with the social robot Pepper during two 2.5 min sessions. The robot was equipped with a custom dialogue system based on the GPT-3 large language model, allowing autonomous responses to verbal input. Participants’ affective changes towards the robot were assessed using three questionnaires: NARS and RAS, both commonly used in HRI studies, and Closeness, based on the IOS scale. In addition to the three standard questionnaires, a custom question was administered to capture participants’ views on robot capabilities. All measures were collected three times: before the interaction with the robot, after the first interaction, and after the second interaction. Results revealed that participants largely retained the expectations they brought into the study and, in contrast to our hypothesis, none of the measured scales moved towards a common mean. Moreover, previous experience with robots emerged as a major factor in how participants experienced the robot in the study. These results could be interpreted as implying that expectations of robots are largely formed before interaction with the robot, and that these expectations do not necessarily change as a result of the interaction. The results also reveal a strong connection to how expectations are studied in social psychology and human–human interaction, underpinning its relevance for HRI research.
{"title":"Previous Experience Matters: An in-Person Investigation of Expectations in Human–Robot Interaction","authors":"Julia Rosén, Jessica Lindblom, Maurice Lamb, Erik Billing","doi":"10.1007/s12369-024-01107-3","DOIUrl":"https://doi.org/10.1007/s12369-024-01107-3","url":null,"abstract":"<p>The human–robot interaction (HRI) field goes beyond the mere technical aspects of developing robots, often investigating how humans perceive robots. Human perceptions and behavior are determined, in part, by expectations. Given the impact of expectations on behavior, it is important to understand what expectations individuals bring into HRI settings and how those expectations may affect their interactions with the robot over time. For many people, social robots are not a common part of their experiences, thus any expectations they have of social robots are likely shaped by other sources. As a result, individual expectations coming into HRI settings may be highly variable. Although there has been some recent interest in expectations within the field, there is an overall lack of empirical investigation into its impacts on HRI, especially in-person robot interactions. To this end, a within-subject in-person study (<span>(N=31)</span>) was performed where participants were instructed to engage in open conversation with the social robot Pepper during two 2.5 min sessions. The robot was equipped with a custom dialogue system based on the GPT-3 large language model, allowing autonomous responses to verbal input. Participants’ affective changes towards the robot were assessed using three questionnaires, NARS, RAS, commonly used in HRI studies, and Closeness, based on the IOS scale. In addition to the three standard questionnaires, a custom question was administered to capture participants’ views on robot capabilities. All measures were collected three times, before the interaction with the robot, after the first interaction with the robot, and after the second interaction with the robot. Results revealed that participants to large degrees stayed with the expectations they had coming into the study, and in contrast to our hypothesis, none of the measured scales moved towards a common mean. Moreover, previous experience with robots was revealed to be a major factor of how participants experienced the robot in the study. These results could be interpreted as implying that expectations of robots are to large degrees decided before interactions with the robot, and that these expectations do not necessarily change as a result of the interaction. Results reveal a strong connection to how expectations are studied in social psychology and human-human interaction, underpinning its relevance for HRI research.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"6 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2024-02-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140002516","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-02-25 | DOI: 10.1007/s12369-024-01102-8
A Psychological Need-Fulfillment Perspective for Designing Social Robots that Support Well-Being
Suzanne Janssen, Bob R. Schadenberg
This conceptual paper presents a novel framework for the design and study of social robots that support well-being. Building upon self-determination theory and the associated Motivation, Engagement, and Thriving in User Experience (METUX) model, this paper argues that users’ basic psychological needs for autonomy, competence, and relatedness should be put at the center of social robot design. These basic needs are essential to people’s psychological well-being, engagement, and self-motivation. However, the current literature offers limited insight into how human–robot interactions relate to users’ experiences of the satisfaction of their basic psychological needs and thus to their well-being and flourishing. We propose that a need-fulfillment perspective could be an inspiring lens for the design of social robots, including socially assistive robots. We conceptualize various ways in which a psychological need-fulfillment perspective may be incorporated into future human–robot interaction research and design, ranging from the interface level to the specific tasks performed by a robot or the user’s behavior supported by the robot. The paper discusses the implications of the framework for designing social robots that promote well-being, as well as the implications for future research.
{"title":"A Psychological Need-Fulfillment Perspective for Designing Social Robots that Support Well-Being","authors":"Suzanne Janssen, Bob R. Schadenberg","doi":"10.1007/s12369-024-01102-8","DOIUrl":"https://doi.org/10.1007/s12369-024-01102-8","url":null,"abstract":"<p>This conceptual paper presents a novel framework for the design and study of social robots that support well-being. Building upon the self-determination theory and the associated Motivation, Engagement, and Thriving in User Experience (METUX) model, this paper argues that users’ psychological basic needs for autonomy, competence, and relatedness should be put at the center of social robot design. These basic needs are essential to people’s psychological well-being, engagement, and self-motivation. However, current literature offers limited insights into how human–robot interactions are related to users’ experiences of the satisfaction of their basic psychological needs and thus, to their well-being and flourishing. We propose that a need-fulfillment perspective could be an inspiring lens for the design of social robots, including socially assistive robots. We conceptualize various ways in which a psychological need-fulfillment perspective may be incorporated into future human–robot interaction research and design, ranging from the interface level to the specific tasks performed by a robot or the user’s behavior supported by the robot. The paper discusses the implications of the framework for designing social robots that promote well-being, as well as the implications for future research.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"41 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2024-02-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139968943","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-02-22 | DOI: 10.1007/s12369-024-01108-2
Being in a Crowd Shifts People’s Attitudes Toward Humanoids
Rae Yule Kim
Humanoid store staff are increasingly common. Many retailers rely on automation to address the mounting pressure for cost savings. Meanwhile, humanoid store staff can influence customer experience negatively: people feel stressed and uneasy when interacting with humanoids. In this article, I investigate one way to improve people’s attitudes toward humanoids. People’s negative attitudes towards humanoids, such as fear or a sense of eeriness, can be substantially mitigated when they are in a crowded environment. People in a crowded environment do not fear robots or find them uncanny, whereas people in a relatively uncrowded environment do. I refer to this phenomenon as the “crowd effect.” Being in a crowded environment substantially changes how people evaluate and respond to humanoids. The effect is mediated by risk-aversion tendency.
{"title":"Being in a Crowd Shifts People’s Attitudes Toward Humanoids","authors":"Rae Yule Kim","doi":"10.1007/s12369-024-01108-2","DOIUrl":"https://doi.org/10.1007/s12369-024-01108-2","url":null,"abstract":"<p>Humanoid store staff is increasingly common. Many retailers rely on automation to address the mounting pressure for cost savings. Meanwhile, humanoid store staff can influence customer experience negatively. People feel stressed and eerie about interacting with humanoids. In this article, I investigate one way to improve people’s attitudes toward humanoids. People’s negative attitudes towards humanoids, such as fear or a sense of eeriness, can be substantially mitigated when they are in a crowded environment. People in a crowded environment do not fear robots or find robots uncanny while people in a relatively uncrowded environment do. I refer to this phenomenon as the “crowd effect.” Being in a crowded environment substantially changes how people evaluate and respond to humanoids. The effect is mediated by risk aversion tendency.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"8 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2024-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139949259","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-02-21 | DOI: 10.1007/s12369-024-01106-4
Interaction Assessment of a Social-Care Robot in Day center Patients with Mild to Moderate Cognitive Impairment: A Pilot Study
María Trinidad Rodríguez-Domínguez, María Isabel Bazago-Dómine, María Jiménez-Palomares, Gerardo Pérez-González, Pedro Núñez, Esperanza Santano-Mogena, Elisa María Garrido-Ardila
As dementia-induced impairments of daily functioning escalate, novel cognitive stimulation techniques utilizing technological advances, like social robots, have surfaced. This study examines the interaction level of the EBO social-care robot with day center patients in Cáceres, Extremadura, Spain, using systematic video analysis as a method of interaction assessment. This observational pilot study was performed on patients above 65 with mild to moderate cognitive impairment (Mini-Mental State Examination ≥ 21) receiving cognitive therapy at the AZTIDE social and health center. Two individualized 10–15 min sessions, using the Wizard of Oz technique, were conducted per participant, with the human operator’s commands executed by the EBO robot without the participants noticing. Of the six participants involved, all maintained complete eye contact with the robot, and 83.3% of the interactions recorded maximum attention. Participants felt comfortable and calm, rating conversational factors such as attentiveness and naturalness as ‘good’ or ‘excellent’. The high interaction level with the EBO robot suggests it is a promising tool for cognitive stimulation in patients with mild to moderate cognitive impairment. The systematic video evaluation also appears effective in assessing user–robot interaction, underscoring its potential utility in future social robotics research.
{"title":"Interaction Assessment of a Social-Care Robot in Day center Patients with Mild to Moderate Cognitive Impairment: A Pilot Study","authors":"María Trinidad Rodríguez-Domínguez, María Isabel Bazago-Dómine, María Jiménez-Palomares, Gerardo Pérez-González, Pedro Núñez, Esperanza Santano-Mogena, Elisa María Garrido-Ardila","doi":"10.1007/s12369-024-01106-4","DOIUrl":"https://doi.org/10.1007/s12369-024-01106-4","url":null,"abstract":"<p>As dementia-induced impairments of daily functioning escalate, novel cognitive stimulation techniques utilizing technological advances, like social robots, have surfaced. This study examines the interaction level of the EBO social-care robot with day center patients in Cáceres, Extremadura, Spain. The study uses systematic video analysis as a method of interaction assessment. This observational pilot study was performed on patients above 65 with mild to moderate cognitive impairment (Minimental State Examination <span>(ge )</span> 21) receiving cognitive therapy at the AZTIDE social and health center. Two individualized 10–15 min sessions, replicating the Wizard of Oz technique, were conducted per participant, with the human operator’s commands being unnoticeably executed by the EBO robot. Of the six participants involved, all maintained complete eye contact with the robot, with 83.3<span>(%)</span> of the interactions recording maximum attention. Participants felt comfortable and calm, rating conversational factors such as attentiveness and naturalness as ’good’ or ‘excellent’. The high interaction level with the EBO robot suggests it as a promising tool for cognitive stimulation in patients with mild to moderate cognitive impairment. The systematic video evaluation also appears effective in assessing user–robot interaction, thus underscoring its potential utility in future social robotics research.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"30 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2024-02-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139949344","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-02-19 | DOI: 10.1007/s12369-024-01098-1
Exploring the Role of Sociability, Ownership, and Affinity for Technology in Shaping Acceptance and Intention to Use Personal Assistance Robots
Eileen Roesler, Sophie Rudolph, Felix Wilhelm Siebert
Personal assistance robots are making inroads into our private and public lives. At the same time, most people are still unfamiliar with this technology and hesitate to accept and use it for daily tasks. Fortunately, the design of robots can be adjusted to yield greater acceptance, subsequently enabling their utilization across various tasks. Using a scenario-based online experiment, we explored how sociability (low vs. high), ownership (private vs. public), and affinity for technology influence the acceptance of and intention to use a robot for grocery shopping. Moreover, to assess users’ preference regarding robot morphology, participants were asked to choose the robot (technical vs. anthropomorphic design) that they would prefer to use in a supermarket. We found that low sociability of the service robot and a higher affective affinity for technology led to a higher level of acceptance. For more sociable robots, higher levels of anthropomorphism were preferred. Our results point to the importance of task-specific robot design that goes beyond functional considerations.
{"title":"Exploring the Role of Sociability, Ownership, and Affinity for Technology in Shaping Acceptance and Intention to Use Personal Assistance Robots.","authors":"Eileen Roesler, Sophie Rudolph, Felix Wilhelm Siebert","doi":"10.1007/s12369-024-01098-1","DOIUrl":"https://doi.org/10.1007/s12369-024-01098-1","url":null,"abstract":"<p>Personal assistance robots are making inroads into our private and public life. At the same time, most humans are still unfamiliar with this technology and hesitate to accept and use it for daily tasks. Fortunately, the designs of robots can be adjusted to yield greater acceptance, subsequently enabling their utilization across various tasks. Using a scenario-based online experiment, we explored how sociability (low vs. high), ownership (private vs. public), and affinity for technology influence the acceptance and intention to use a robot for grocery shopping. Moreover, to assess users’ preference for robots’ morphology, participants were asked to choose a robot (technical vs. anthropomorphic design) that they would prefer to use in a supermarket. We found that low sociability of the service robot and a higher affective affinity for technology led to a higher level of acceptance. For more sociable robots, higher levels of anthropomorphism were preferred. Our results point to the importance of task-specific robot design that exceeds functional considerations.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"22 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2024-02-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139918623","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-02-17 | DOI: 10.1007/s12369-024-01105-5
Sonic Sleight of Hand: Sound Induces Illusory Distortions in the Perception and Prediction of Robot Action
Joel Currie, Maria Elena Giannaccini, Patric Bach
For efficient human–robot interaction, human operators need to be able to represent the robot’s movements in space and predict its next steps. However, according to frameworks of Bayesian multisensory integration, features outside the motion itself, such as the sounds a robot makes while it moves, should affect how otherwise identical motions are perceived. Here, we translate an established psychophysical task from experimental psychology to a human–robot interaction context, which can measure these distortions to motion perception. In two series of preregistered studies, participants watched a humanoid robot make forward and backward reaching movements. When the robot hand suddenly disappeared, they reported its last seen location, either with the mouse cursor (Experiments 1a and 1b) or by matching it to probe stimuli in different locations (Experiments 2a and 2b). The results revealed that even small changes to the robot’s sound robustly affect participants’ visuospatial representation of its motions: the motion appeared to extend further in space when accompanied by slightly longer sounds (100 ms longer) than by slightly shorter sounds (100 ms shorter). Moreover, these sound changes affect not only where people currently locate the robot’s motion but also where they anticipate its future steps. These findings show that sound design is an effective medium for shaping how people represent otherwise identical robot actions and coordinate their interactions with the robot. The study acts as proof of concept that psychophysical tasks provide a promising tool to measure how design parameters influence the perception and prediction of robot motion.
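As context for the Bayesian multisensory-integration framework invoked above, the standard reliability-weighted cue-combination model (a textbook formulation, not an equation taken from this paper, with generic symbols rather than the authors’ notation) illustrates why a slightly longer sound can pull the perceived motion endpoint further along its path: the combined estimate weights each cue by its reliability, so the auditory evidence biases the visually specified location.

$$
\hat{x} = w_V\,\hat{x}_V + w_A\,\hat{x}_A,
\qquad
w_V = \frac{1/\sigma_V^{2}}{1/\sigma_V^{2} + 1/\sigma_A^{2}},
\qquad
w_A = 1 - w_V,
$$

where $\hat{x}_V$ and $\hat{x}_A$ are the endpoint estimates suggested by vision and audition, and $\sigma_V^{2}$ and $\sigma_A^{2}$ are their variances; the less reliable a cue, the less it contributes to the combined percept.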
{"title":"Sonic Sleight of Hand: Sound Induces Illusory Distortions in the Perception and Prediction of Robot Action","authors":"Joel Currie, Maria Elena Giannaccini, Patric Bach","doi":"10.1007/s12369-024-01105-5","DOIUrl":"https://doi.org/10.1007/s12369-024-01105-5","url":null,"abstract":"<p>For efficient human–robot interaction, human operators need to be able to efficiently represent the robot’s movements in space and predict its next steps. However, according to frameworks of Bayesian multisensory integration, features outside the motion itself—like the sounds a robot makes while it moves—should affect how otherwise identical motions are perceived. Here, we translate an established psychophysical task from experimental psychology to a human–robot interaction context, which can measure these distortions to motion perception. In two series of preregistered studies, participants watched a humanoid robot make forward and backward reaching movements. When the robot hand suddenly disappeared, they reported its last seen location, either with the mouse cursor (Experiment 1a and 1b) or by matching it to probe stimuli in different locations (Experiment 2a and 2b). The results revealed that even small changes to the robot’s sound robustly affect participants’ visuospatial representation of its motions, so that the motion appeared to extend further in space when accompanied by slightly (100 ms) longer sounds compared to slightly shorter sounds (100 ms shorter). Moreover, these sound changes do not only affect where people currently locate the robot’s motion, but where they anticipate its future steps. These findings show that sound design is an effective medium for manipulating how people represent otherwise identical robot actions and coordinate its interactions with it. The study acts as proof of concept that psychophysical tasks provide a promising tool to measure how design parameters influence the perception and prediction of robot motion.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"23 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2024-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139759425","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-02-16 | DOI: 10.1007/s12369-024-01101-9
Attention Sharing Handling Through Projection Capability Within Human–Robot Collaboration
Benjamin Camblor, David Daney, Lucas Joseph, Jean-Marc Salotti
The link between situation awareness (SA) and the distribution of human attention has been explored within a human–robot collaboration framework. According to Endsley (1995), SA is divided into three levels: perception, comprehension, and projection. It is involved in the process of making decisions and carrying out actions in a dynamic environment. This work investigates three hypotheses: first, that the ability to project a robot’s future actions improves performance in a collaborative task; second, that the more participants are involved in tasks in a collaborative environment, the better their SA will be; and finally, that the use of a robot’s non-verbal communication motions attracts a participant’s attention more promptly than if the robot remains motionless. A within-participants study was designed to investigate these three hypotheses. Participants were asked to perform a collaborative task with a robot, which required them to assist the robot at different moments while they were engaged in a distracting task that captured their attention (the Tower of Hanoi puzzle). These moments could either be anticipated and taken into account in the human decision-making and action loop or not. Lastly, the robot could either use non-verbal communication gestures to draw human attention or not. The results demonstrate the significance of accounting for the human capability to project a robot’s next actions in participants’ own attention management. Moreover, the subjective measures showed no difference in the assessment of SA, in contrast to the objective measures, which were in line with our second hypothesis. Finally, it seems that remaining stationary can itself be considered a gesture of non-verbal communication: in the present work, the robot’s waiting was more salient in capturing human attention when the robot remained motionless than when it made a signaling motion.
{"title":"Attention Sharing Handling Through Projection Capability Within Human–Robot Collaboration","authors":"Benjamin Camblor, David Daney, Lucas Joseph, Jean-Marc Salotti","doi":"10.1007/s12369-024-01101-9","DOIUrl":"https://doi.org/10.1007/s12369-024-01101-9","url":null,"abstract":"<p>The link between situation awareness (SA) and the distribution of human attention, has been explored within a human robot collaboration framework. According to Endsley (1995), SA is divided into three levels: perception, comprehension and projection. It is involved in the process of making decisions and carrying out actions in a dynamic environment. This work investigates three hypotheses. First, that the ability to project a robot’s future actions improves performance in a collaborative task. Second, that the more participants are involved in tasks in a collaborative environment, the better their SA will be. Finally, that the use of a robot’s non-verbal communication motions attracts a participant’s attention more promptly than if the robot remains motionless. A within-participants study has been designed to investigate our three hypotheses. Participants were asked to perform a collaborative task with a robot. It required them to assist the robot at different moments while they were engaged in a distracting task that was catching their attention (tower of Hanoi puzzle). These moments could either be anticipated and taken into account in the human decision-making and action loop or not. Lastly, the robot could either use non-verbal communication gestures to draw human attention or not. The results have demonstrated the significance of considering the human capability to project a robot next actions in their own personal attention management. Moreover, the subjective measures showed no difference in the assessment of SA, in contrast to the objective measures, which are in line with our second hypothesis. Finally, it seems that standing stationary can be considered a gesture of non-verbal communication. In the present work, robot waiting was more salient in capturing human attention when the robot remained motionless rather than making a signaling motion.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"5 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2024-02-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139759422","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}