Pub Date: 2023-12-16 | DOI: 10.1007/s12369-023-01064-3
Gaze-Cues of Humans and Robots on Pedestrian Ways
Carla S. Jakobowsky, Anna M. H. Abrams, Astrid M. Rosenthal-von der Pütten
Delivery robots and personal cargo robots increasingly share space with incidentally co-present persons (InCoPs) on pedestrian ways, facing the challenge of socially adequate and safe navigation. Humans effortlessly negotiate this shared space by signalling their skirting intentions via non-verbal gaze cues. In two online experiments we investigated whether this phenomenon of gaze cueing transfers to human–robot interaction. In the first study, participants (n = 92) watched short videos in which either a human, a humanoid robot or a non-humanoid delivery robot moved towards the camera. In each video, the counterpart either looked straight towards the camera or made an eye movement to the right or left. The results showed that when the counterpart gaze-cued to their left, participants also skirted more often to the left from their own perspective, thereby passing each other and avoiding collision. Since those participants were recruited in a right-hand driving country, we replicated the study in left-hand driving countries (n = 176). Participants skirted more often to the right when the counterpart gaze-cued to the right, and to the left for eye movements to the left, extending our previous result. In both studies, skirting behavior did not differ by type of counterpart. Hence, gaze cues increase the chance of triggering complementary skirting behavior in InCoPs independently of robot morphology. Equipping robots with eyes can help indicate moving direction via gaze cues and thereby improve interactions between humans and robots on pedestrian ways.
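Illustrative sketch (not from the paper): the finding suggests a simple signalling policy for a robot navigation stack, in which the robot commits to a skirting side and telegraphs it with a gaze cue before steering. The `GazeCapableRobot` stub, its methods, and the one-second cue lead time are all assumptions.

```python
import time
from dataclasses import dataclass


@dataclass
class GazeCapableRobot:
    """Hypothetical stand-in for a delivery robot with movable eyes."""
    name: str

    def look(self, side: str) -> None:
        print(f"{self.name}: gaze cue to the {side}")

    def steer(self, side: str) -> None:
        print(f"{self.name}: skirting to the {side}")


def signal_then_skirt(robot: GazeCapableRobot, side: str,
                      cue_lead_s: float = 1.0) -> None:
    # Issue the gaze cue *before* steering, so an incidentally co-present
    # person has time to choose the complementary side.
    robot.look(side)
    time.sleep(cue_lead_s)
    robot.steer(side)


if __name__ == "__main__":
    signal_then_skirt(GazeCapableRobot("cargo-bot"), side="left")
```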
{"title":"Gaze-Cues of Humans and Robots on Pedestrian Ways","authors":"Carla S. Jakobowsky, Anna M. H. Abrams, Astrid M. Rosenthal-von der Pütten","doi":"10.1007/s12369-023-01064-3","DOIUrl":"https://doi.org/10.1007/s12369-023-01064-3","url":null,"abstract":"<p>Delivery robots and personal cargo robots are increasingly sharing space with incidentally co-present persons (InCoPs) on pedestrian ways facing the challenge of socially adequate and safe navigation. Humans are able to effortlessly negotiate this shared space by signalling their skirting intentions via non-verbal gaze cues. In two online-experiments we investigated whether this phenomenon of gaze cuing can be transferred to human–robot interaction. In the first study, participants (<i>n</i> = 92) watched short videos in which either a human, a humanoid robot or a non-humanoid delivery robot moved towards the camera. In each video, the counterpart looked either straight towards the camera or did an eye movement to the right or left. The results showed that when the counterpart gaze cued to their left, also participants skirted more often to the left from their perspective, thereby walking past each other and avoiding collision. Since the participants were recruited in a right-hand driving country we replicated the study in left-hand driving countries (<i>n</i> = 176). Results showed that participants skirted more often to the right when the counterpart gaze cued to the right, and to the left in case of eye movements to the left, expanding our previous result. In both studies, skirting behavior did not differ regarding the type of counterpart. Hence, gaze cues increase the chance to trigger complementary skirting behavior in InCoPs independently of the robot morphology. Equipping robots with eyes can help to indicate moving direction by gaze cues and thereby improve interactions between humans and robots on pedestrian ways.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"20 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138683896","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-14 | DOI: 10.1007/s12369-023-01081-2
Correction to: Autonomous Systems and Technology Resistance: New Tools for Monitoring Acceptance, Trust, and Tolerance
M. Cappuccio, Jai C. Galliott, Friederike Eyssel, Alessandro Lanteri
{"title":"Correction to: Autonomous Systems and Technology Resistance: New Tools for Monitoring Acceptance, Trust, and Tolerance","authors":"M. Cappuccio, Jai C. Galliott, Friederike Eyssel, Alessandro Lanteri","doi":"10.1007/s12369-023-01081-2","DOIUrl":"https://doi.org/10.1007/s12369-023-01081-2","url":null,"abstract":"","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"150 1","pages":"1"},"PeriodicalIF":4.7,"publicationDate":"2023-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139180002","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-08 | DOI: 10.1007/s12369-023-01087-w
Design Attributes of Socially Assistive Robots for People with Dementia: A Systematic Review
Matthew Green, Dzung Dao, Wendy Moyle
Socially assistive robots (SARs) have shown promise in the care of people with dementia and in mitigating behavioural and psychological symptoms of dementia. Although SARs are continually tested for efficacy, no current literature outlines a comprehensive strategy that industrial designers may employ to advance SAR technology. It was, therefore, essential to expand on existing literature by providing a straightforward approach to SAR design with recommended design attributes. A systematic review was conducted to formulate recommendations for designing SARs to improve the quality of life of people with dementia. Six databases (CINAHL, Embase, IEEE, Medline, ProQuest, and Scopus) were searched for relevant articles published between 2011 and 2022. Covidence software was used for screening, data extraction and quality testing. Of the 160 references extracted, 16 studies met the inclusion criteria. The studies predominantly had small sample sizes and used various robotic platforms and technologies. Incorporating personal preferences linked to a user’s life experience and choice is a crucial ability of SARs. Natural speech communication is also an important design attribute. However, the overwhelming conclusion is that more research is needed on aesthetics, materials, and interaction capabilities. All stakeholders should be part of a holistic user-centred design process to ensure a fit-for-purpose product.
{"title":"Design Attributes of Socially Assistive Robots for People with Dementia: A Systematic Review","authors":"Matthew Green, Dzung Dao, Wendy Moyle","doi":"10.1007/s12369-023-01087-w","DOIUrl":"https://doi.org/10.1007/s12369-023-01087-w","url":null,"abstract":"<p>Socially assistive robots (SARs) have shown promise in the care of people with dementia and in mitigating behavioural and psychological symptoms of dementia. Although SARs are continually tested for efficacy, no current literature outlines a comprehensive strategy that industrial designers may employ to progress the technology of SARs. It was, therefore, essential to expand on existing literature by providing a straightforward approach to SAR design with the recommended design attributes. A systematic review was conducted to formulate recommendations for designing SARs to improve the quality of life of people with dementia. Six databases, including CINAHL, Embase, IEEE, Medline, ProQuest, and Scopus, were searched for relevant articles published between 2011 and 2022. Covidence software was used for screening, data extraction and quality testing. Of the 160 references extracted, 16 studies met the study inclusion criteria. The studies were predominately small sample sizes using various robotic platforms and technologies. Incorporating personal preferences linked to a user’s life experience and choice is a crucial ability of SARs. Natural speech communication is also an important design attribute. However, the overwhelming conclusion is that more research is needed on aesthetics, materials, and interaction capabilities. All stakeholders should be part of a holistic user-centred design process to ensure a fit-for-purpose product.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"19 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138553206","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-05 | DOI: 10.1007/s12369-023-01063-4
Does the Social Robot Nao Facilitate Cooperation in High Functioning Children with ASD?
Viviane Kostrubiec, Chloé Lajunta, Pierre-Vincent Paubel, Jeanne Kruck
We designed a coordination–cooperation game dedicated to teaching the theory of mind (ToM) to children with autism spectrum disorder. Children interacted with either a robot or a human. They had to coordinate their gestures with the beats of a ditty sung by their partner (coordination), who then implicitly asked them for help (cooperation). Before and after this cooperation–coordination task, the children performed a helping task that assessed their ToM skills: the ability to infer social partners’ intentions. Despite the regularity and predictability of the robot, children made the most progress in the helping task after interacting with a human. Motor coupling was more stable in child–human than in child–robot dyads. The ability of the social partner to actively maintain a stable social coupling seems to be a primary factor inciting the child to learn and transfer the just-practiced social skills.
{"title":"Does the Social Robot Nao Facilitate Cooperation in High Functioning Children with ASD?","authors":"Viviane Kostrubiec, Chloé Lajunta, Pierre-Vincent Paubel, Jeanne Kruck","doi":"10.1007/s12369-023-01063-4","DOIUrl":"https://doi.org/10.1007/s12369-023-01063-4","url":null,"abstract":"<p>We designed a coordination–cooperation game dedicated to teaching the theory of mind (ToM) to children with autism spectrum disorder. Children interacted with either a robot or a human. They had to coordinate their gestures with the beats of a ditty sung by their partner (coordination), who then implicitly asked them for help (cooperation). Before and after this cooperation–coordination task, the children performed a helping task that assessed their ToM skills: the ability to infer social partners’ intentions. Despite the regularity and predictability of the robot, children made the most progress in the helping task after interacting with a human. Motor coupling was more stable in child–human than in child–robot dyads. The ability of the social partner to actively maintain a stable social coupling seems to be a primary factor inciting the child to learn and transfer the just-practiced social skills.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"47 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138543758","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-05 | DOI: 10.1007/s12369-023-01070-5
Moffuly-II: A Robot that Hugs and Rubs Heads
Yuya Onishi, Hidenobu Sumioka, Masahiro Shiomi
Although whole-body touch interaction such as hugging is essential for human beings from various perspectives, not everyone can interact with intimate friends and family because of physical separation caused by circumstances such as pandemics or geographical constraints. Human–robot touch interaction is one approach to ameliorating such missing touch interactions. In this study, we developed a robot named Moffuly-II that hugs people and rubs their heads during a hug, because head-touching behaviors are typical affective interactions between intimate persons. Moffuly-II is a large huggable teddy-bear-type robot capable of both hugging and touching the head. We conducted an experiment with human participants and evaluated the effectiveness of combining intra-hug gestures (squeezing and rubbing) with touch area (back and head). From the experimental results, we identified advantages of rubbing gestures over squeezing gestures and some advantages of head-touching behaviors over back-touching behaviors.
{"title":"Moffuly-II: A Robot that Hugs and Rubs Heads","authors":"Yuya Onishi, Hidenobu Sumioka, Masahiro Shiomi","doi":"10.1007/s12369-023-01070-5","DOIUrl":"https://doi.org/10.1007/s12369-023-01070-5","url":null,"abstract":"<p>Although whole-body touch interaction, e.g., hugging, is essential for human beings from various perspectives, not everyone can interact with intimate friends/family due to physical separations caused by such circumstances as pandemics, geographical constraints, etc. The possibility of human–robot touch interaction is one approach that ameliorates such missing touch interactions. In this study, we developed a robot named Moffuly-II, that hugs people and rubs their heads during a hug because head-touching behaviors are typical affective interactions between intimate persons. Moffuly-II is a large huggable teddy-bear type robot and it has enough capability to both hug and touch the head. We conducted an experiment with human participants and evaluated the effectiveness of combining intra-hug gestures (squeezing and rubbing) and the touch area (back and head). From experimental results, we identified the advantages of implementing rubbing gestures compared to squeezing gestures and some of the advantages of head-touching behaviors compared to back-touching behaviors.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"875 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138529961","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-04 | DOI: 10.1007/s12369-023-01084-z
Experimental and Integrative Approaches to Robo-ethics. An Introduction
Francesco Bianchini, Luisa Damiano, E. Datteri, Pierluigi Graziani
{"title":"Experimental and Integrative Approaches to Robo-ethics. An Introduction","authors":"Francesco Bianchini, Luisa Damiano, E. Datteri, Pierluigi Graziani","doi":"10.1007/s12369-023-01084-z","DOIUrl":"https://doi.org/10.1007/s12369-023-01084-z","url":null,"abstract":"","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"61 3","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138604881","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-04 | DOI: 10.1007/s12369-023-01052-7
Do Robots Have Sex? A Prolegomenon
Robert Sparrow, Eliana Horn, Friederike Eyssel
Research in Human–Robot Interaction (HRI) suggests that people attribute gender to (some) robots. In this paper we outline a program of research on the gendering of robots and on the ethical issues raised by such gendering. Understanding which robots are gendered, when, and why will require careful research in HRI, drawing on anthropology and social psychology and informed by state-of-the-art research in gender studies and critical theory. Design features of robots that might influence the attribution of gender include: appearance; tone of voice; speech repertoire; range and style of movement; behaviour; and intended function. Robots may be gendered differently depending on: the age, class, sex, ethnicity, and sexuality of the person doing the attributing; local cultural histories; social cues from the designers, the physical and institutional environment, and other users; and the role of the robot. An adequate account of the gender of robots will also need to attend to the limits of a sex/gender distinction that has historically been maintained by reference to a “sex” located in a biological body. We argue that, on some accounts of what it is to be sexed, robots might “have” sex: they might be male and female in just the same way as (most) human beings are. Addressing the ethical issues raised by the gendering of robots will require further progress in “robot media ethics”, as well as an account of the responsibilities of both designers and users in a broader social context.
{"title":"Do Robots Have Sex? A Prolegomenon","authors":"Robert Sparrow, Eliana Horn, Friederike Eyssel","doi":"10.1007/s12369-023-01052-7","DOIUrl":"https://doi.org/10.1007/s12369-023-01052-7","url":null,"abstract":"<p>Research in Human–Robot Interaction (HRI) suggests that people attribute gender to (some) robots. In this paper we outline a program of research on the gendering of robots and on the ethical issues raised by such gendering. Understanding which robots are gendered, when, and why, will require careful research in HRI, drawing on anthropology and social psychology, informed by state-of-the-art research in gender studies and critical theory. Design features of robots that might influence the attribution of gender include: appearance; tone of voice; speech repertoire; range and style of movement; behaviour; and, intended function. Robots may be gendered differently depending on: the age, class, sex, ethnicity, and sexuality of the person doing the attributing; local cultural histories; social cues from the designers, the physical and institutional environment, and other users; and the role of the robot. An adequate account of the gender of robots will also need to pay attention to the limits of a sex/gender distinction, which has historically been maintained by reference to a “sex” located in a biological body, when it comes to theorising the gender of robots. We argue that, on some accounts of what it is to be sexed, robots might “have” sex: they might be male and female in just the same way as (most) human beings are. Addressing the ethical issues raised by the gendering of robots will require further progress in “robot media ethics”, as well as an account of the responsibilities of both designers and users in a broader social context.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"36 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138529960","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-01 | DOI: 10.1007/s12369-022-00890-1
Artificial Emotions: Toward a Human-Centric Ethics
Laura Corti, Nicola Di Stefano, Marta Bertolaso
{"title":"Artificial Emotions: Toward a Human-Centric Ethics","authors":"Laura Corti, Nicola Di Stefano, Marta Bertolaso","doi":"10.1007/s12369-022-00890-1","DOIUrl":"https://doi.org/10.1007/s12369-022-00890-1","url":null,"abstract":"","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"40 4","pages":"2039-2053"},"PeriodicalIF":4.7,"publicationDate":"2023-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139187759","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-11-30 | DOI: 10.1007/s12369-023-01076-z
Building Long-Term Human–Robot Relationships: Examining Disclosure, Perception and Well-Being Across Time
Guy Laban, Arvid Kappas, Val Morrison, Emily S. Cross
While interactions with social robots are novel and exciting for many people, one concern is the extent to which people’s behavioural and emotional engagement can be sustained over time, since a robot’s novelty is especially salient during initial interactions. This challenge is particularly noteworthy for interactions designed to support people’s well-being, as there is limited evidence (or empirical exploration) of social robots’ capacity to support people’s emotional health over time. Accordingly, our aim here was to examine how long-term repeated interactions with a social robot affect people’s self-disclosure behaviour toward the robot and their perceptions of it, and how such sustained interactions influence factors related to well-being. We conducted a mediated long-term online experiment in which participants conversed with the social robot Pepper 10 times over 5 weeks. We found that people self-disclose more and more to a social robot over time, and report it to be more social and competent over time. Participants’ moods also improved after talking to the robot, and across sessions they found the robot’s responses increasingly comforting and reported feeling less lonely. Finally, our results emphasize that when the discussion was framed more emotionally (in this case, framing questions in the context of the COVID-19 pandemic), participants reported feeling lonelier and more stressed. These results set the stage for situating social robots as conversational partners and provide crucial evidence for their potential inclusion in interventions supporting people’s emotional health through encouraging self-disclosure.
{"title":"Building Long-Term Human–Robot Relationships: Examining Disclosure, Perception and Well-Being Across Time","authors":"Guy Laban, Arvid Kappas, Val Morrison, Emily S. Cross","doi":"10.1007/s12369-023-01076-z","DOIUrl":"https://doi.org/10.1007/s12369-023-01076-z","url":null,"abstract":"<p>While interactions with social robots are novel and exciting for many people, one concern is the extent to which people’s behavioural and emotional engagement might be sustained across time, since during initial interactions with a robot, its novelty is especially salient. This challenge is particularly noteworthy when considering interactions designed to support people’s well-being, with limited evidence (or empirical exploration) of social robots’ capacity to support people’s emotional health over time. Accordingly, our aim here was to examine how long-term repeated interactions with a social robot affect people’s self-disclosure behaviour toward the robot, their perceptions of the robot, and how such sustained interactions influence factors related to well-being. We conducted a mediated long-term online experiment with participants conversing with the social robot Pepper 10 times over 5 weeks. We found that people self-disclose increasingly more to a social robot over time, and report the robot to be more social and competent over time. Participants’ moods also improved after talking to the robot, and across sessions, they found the robot’s responses increasingly comforting as well as reported feeling less lonely. Finally, our results emphasize that when the discussion frame was supposedly more emotional (in this case, framing questions in the context of the COVID-19 pandemic), participants reported feeling lonelier and more stressed. These results set the stage for situating social robots as conversational partners and provide crucial evidence for their potential inclusion in interventions supporting people’s emotional health through encouraging self-disclosure.\u0000</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"875 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-11-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138529968","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-11-29 | DOI: 10.1007/s12369-023-01079-w
Expanding the Interaction Repertoire of a Social Drone: Physically Expressive Possibilities of a Perched BiRDe
Ori Fartook, Karon MacLean, Tal Oron-Gilad, Jessica R. Cauchard
The field of human–drone interaction (HDI) has investigated a growing number of applications for social drones, all while focusing on the drone’s inherent ability to fly, thus overlooking interaction opportunities such as a drone in its perched (i.e., non-flying) state. A drone cannot fly constantly, however, and more realistic HDI is needed. In this exploratory work, we therefore decoupled a social drone’s flying state from its perched state and investigated user interpretations of its physical rendering. To do so, we designed and developed BiRDe: a Bodily expressIons and Respiration Drone conveying Emotions. BiRDe was designed to render a range of emotional states by modulating its respiratory rate (RR) and changing its body posture using reconfigurable wings and head positions. Following its design, a validation study was conducted. In a laboratory study, participants (N = 30) observed and labeled twelve of BiRDe’s emotional behaviors using valence- and arousal-based emotional states. We identified consistent patterns in how BiRDe’s RR, wings, and head influenced perception in terms of valence, arousal, and willingness to interact. Furthermore, participants interpreted 11 of the 12 behaviors in line with our initial design intentions. This work demonstrates a drone’s ability to communicate emotions even while perched and offers design implications and future applications.
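As a rough illustration of this style of affect rendering, the following is a minimal sketch of how a valence/arousal state could be mapped to a respiratory rate and a wing/head configuration. The mapping, parameter ranges, and names are invented for illustration and are not the parameters used for BiRDe.

```python
from dataclasses import dataclass


@dataclass
class PerchedPose:
    breaths_per_min: float  # respiratory rate rendered by the body
    wing_angle_deg: float   # 0 = folded/drooped, 90 = raised/spread
    head_pitch_deg: float   # negative = lowered head, positive = raised


def render_affect(valence: float, arousal: float) -> PerchedPose:
    """Map a valence/arousal state (both in [-1, 1]) to a perched pose.

    Assumed convention: arousal drives breathing rate, while valence
    drives how open and raised the posture is. All ranges are invented
    for illustration, not taken from the BiRDe paper.
    """
    valence = max(-1.0, min(1.0, valence))
    arousal = max(-1.0, min(1.0, arousal))
    return PerchedPose(
        breaths_per_min=12 + 10 * arousal,  # ~2 bpm calm to ~22 bpm excited
        wing_angle_deg=45 + 45 * valence,   # wings droop at negative valence
        head_pitch_deg=20 * valence,        # head lowers at negative valence
    )


if __name__ == "__main__":
    print(render_affect(valence=0.8, arousal=0.6))    # e.g. "joyful"
    print(render_affect(valence=-0.7, arousal=-0.4))  # e.g. "sad"
```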
{"title":"Expanding the Interaction Repertoire of a Social Drone: Physically Expressive Possibilities of a Perched BiRDe","authors":"Ori Fartook, Karon MacLean, Tal Oron-Gilad, Jessica R. Cauchard","doi":"10.1007/s12369-023-01079-w","DOIUrl":"https://doi.org/10.1007/s12369-023-01079-w","url":null,"abstract":"<p>The field of human–drone interaction (HDI) has investigated an increasing number of applications for social drones, all while focusing on the drone’s inherent ability to fly, thus overpassing interaction opportunities, such as a drone in its perched (i.e., non-flying) state. A drone cannot constantly fly and a need for more realistic HDI is needed, therefore, in this exploratory work, we have decoupled a social drone’s flying state from its perched state and investigated user interpretations of its physical rendering. To do so, we designed and developed BiRDe: a Bodily expressIons and Respiration Drone conveying Emotions. BiRDe was designed to render a range of emotional states by modulating its respiratory rate (RR) and changing its body posture using reconfigurable wings and head positions. Following its design, a validation study was conducted. In a laboratory study, participants (<span>({N}={30})</span>) observed and labeled twelve of BiRDe’s emotional behaviors using Valence and Arousal based emotional states. We identified consistent patterns in how BiRDe’s RR, wings, and head had influenced perception in terms of valence, arousal, and willingness to interact. Furthermore, participants interpreted 11 out of the 12 behaviors in line with our initial design intentions. This work demonstrates a drone’s ability to communicate emotions even while perched and offers design implications and future applications.\u0000</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"875 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138529980","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}