{"title":"基于表情情感强度因子推导的自动检疫机器人人机交互面部表情优化设计","authors":"Hyeseung Jeong, Seongsoo Hong","doi":"10.15187/adr.2023.08.36.3.317","DOIUrl":null,"url":null,"abstract":"Background To develop a robot that shares emotions with humans, it is necessary to study how robots can express appropriate emotions in various situations. As interest in sterilization to prevent infectious diseases increases worldwide, the development of autonomous quarantine robots that are capable of not only quarantine tasks but also emotional exchanges with humans is of interest. Therefore, this study describes the optimization of a human-robot interaction (HRI) design that facilitates the interaction between humans and autonomous quarantine robots by designing and developing the facial expressions of robots so that they can express emotions. Methods We created a user-robot integration scenario to derive emotional information according to the situation of the robot. Then, we selected the facial expression of the robot preferred by the user through a survey and designed the robot's basic expression based on the Action Units (AU) combination. We derived elements to modulate the intensity of robot emotions based on previous research and human facial expression verification experiments. Finally, we applied the optimal emotional intensity factors derived from the analysis of videos showing the facial expressions of a human actor to the facial expressions of the robot. Results We derived the optimal emotional intensity factors for producing the seven emotions of the autonomous quarantine robot and presented them in a graph. As a result, we found the values that can express the optimal intensity of the seven emotions. In addition, we verified that speed, duration, eye blinking, and color were appropriate for producing the desired emotional intensity by the robot. In addition, we developed a facial expression design that can express the optimal emotion of the robot through the appropriate emotional intensity. 
Conclusions In this study, we add the situations of the user and the robot to the standardized emotional expression of the robot and derive elements to express the intensity of the emotion. Through this, the robot can effectively express its emotions and intentions in various environments and can interact smoothly with humans. In addition, we provide suggestions for developing optimized robots using basic data so that robot manufacturers and robot experts can implement an emotional system that adds flexibility to robot behavior.","PeriodicalId":52137,"journal":{"name":"Archives of Design Research","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Optimal Human-Robot Interaction Facial Expression Design Based on Derivation of Expression Emotional Intensity Factors of an Autonomous Quarantine Robot\",\"authors\":\"Hyeseung Jeong, Seongsoo Hong\",\"doi\":\"10.15187/adr.2023.08.36.3.317\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Background To develop a robot that shares emotions with humans, it is necessary to study how robots can express appropriate emotions in various situations. As interest in sterilization to prevent infectious diseases increases worldwide, the development of autonomous quarantine robots that are capable of not only quarantine tasks but also emotional exchanges with humans is of interest. Therefore, this study describes the optimization of a human-robot interaction (HRI) design that facilitates the interaction between humans and autonomous quarantine robots by designing and developing the facial expressions of robots so that they can express emotions. Methods We created a user-robot integration scenario to derive emotional information according to the situation of the robot. 
Then, we selected the facial expression of the robot preferred by the user through a survey and designed the robot's basic expression based on the Action Units (AU) combination. We derived elements to modulate the intensity of robot emotions based on previous research and human facial expression verification experiments. Finally, we applied the optimal emotional intensity factors derived from the analysis of videos showing the facial expressions of a human actor to the facial expressions of the robot. Results We derived the optimal emotional intensity factors for producing the seven emotions of the autonomous quarantine robot and presented them in a graph. As a result, we found the values that can express the optimal intensity of the seven emotions. In addition, we verified that speed, duration, eye blinking, and color were appropriate for producing the desired emotional intensity by the robot. In addition, we developed a facial expression design that can express the optimal emotion of the robot through the appropriate emotional intensity. Conclusions In this study, we add the situations of the user and the robot to the standardized emotional expression of the robot and derive elements to express the intensity of the emotion. Through this, the robot can effectively express its emotions and intentions in various environments and can interact smoothly with humans. 
In addition, we provide suggestions for developing optimized robots using basic data so that robot manufacturers and robot experts can implement an emotional system that adds flexibility to robot behavior.\",\"PeriodicalId\":52137,\"journal\":{\"name\":\"Archives of Design Research\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-08-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Archives of Design Research\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.15187/adr.2023.08.36.3.317\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Arts and Humanities\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Archives of Design Research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.15187/adr.2023.08.36.3.317","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Arts and Humanities","Score":null,"Total":0}
Optimal Human-Robot Interaction Facial Expression Design Based on Derivation of Expression Emotional Intensity Factors of an Autonomous Quarantine Robot
Background: To develop a robot that shares emotions with humans, it is necessary to study how robots can express appropriate emotions in various situations. As interest in sterilization to prevent infectious diseases grows worldwide, there is corresponding interest in autonomous quarantine robots capable not only of quarantine tasks but also of emotional exchange with humans. This study therefore describes the optimization of a human-robot interaction (HRI) design that facilitates interaction between humans and autonomous quarantine robots by designing and developing robot facial expressions that convey emotion.

Methods: We created a user-robot integration scenario to derive emotional information appropriate to the robot's situation. Through a survey, we then selected the robot facial expressions preferred by users and designed the robot's basic expressions from combinations of Action Units (AU). Based on previous research and verification experiments with human facial expressions, we derived elements for modulating the intensity of the robot's emotions. Finally, we applied the optimal emotional intensity factors, derived from analyzing videos of a human actor's facial expressions, to the robot's facial expressions.

Results: We derived the optimal emotional intensity factors for producing the robot's seven emotions and presented them in a graph, yielding the values that express each emotion at its optimal intensity. We also verified that speed, duration, eye blinking, and color are appropriate parameters for producing the desired emotional intensity, and we developed a facial expression design that expresses the robot's optimal emotion at the appropriate intensity.
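The pipeline the abstract describes, basic expressions built from AU combinations and then scaled by an emotional intensity factor, can be illustrated with a minimal sketch. The AU prototypes below are commonly cited FACS-style combinations, not the survey-derived combinations the authors actually used (which the abstract does not list), and the uniform weights and `neutral` seventh emotion are assumptions for illustration only.

```python
# Illustrative sketch only: scaling AU-based basic expressions by an
# emotional intensity factor. The AU sets are commonly cited FACS-style
# prototypes, NOT the paper's survey-derived combinations; weights of 1.0
# and "neutral" as the seventh emotion are assumptions.
AU_PROTOTYPES = {
    "happiness": {6: 1.0, 12: 1.0},                    # cheek raiser + lip corner puller
    "sadness":   {1: 1.0, 4: 1.0, 15: 1.0},            # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1: 1.0, 2: 1.0, 5: 1.0, 26: 1.0},    # brow raisers + upper lid raiser + jaw drop
    "anger":     {4: 1.0, 5: 1.0, 7: 1.0, 23: 1.0},    # brow lowerer + lid raiser/tightener + lip tightener
    "fear":      {1: 1.0, 2: 1.0, 4: 1.0, 5: 1.0, 20: 1.0, 26: 1.0},
    "disgust":   {9: 1.0, 15: 1.0},                    # nose wrinkler + lip corner depressor
    "neutral":   {},                                   # assumed seventh emotion
}

def expression_targets(emotion: str, intensity: float) -> dict:
    """Scale each AU activation of `emotion` by an intensity factor in [0, 1]."""
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must lie in [0, 1]")
    return {au: weight * intensity for au, weight in AU_PROTOTYPES[emotion].items()}

# Example: a half-intensity smile activates AU6 and AU12 at 0.5.
print(expression_targets("happiness", 0.5))  # {6: 0.5, 12: 0.5}
```

In the paper's design, the intensity factor would additionally drive dynamic parameters such as animation speed, duration, eye blinking, and color, which this static sketch does not model.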
Conclusions: In this study, we incorporated the situations of both the user and the robot into the robot's standardized emotional expressions and derived elements for expressing emotional intensity. As a result, the robot can effectively express its emotions and intentions in various environments and interact smoothly with humans. We also offer suggestions, grounded in this baseline data, to help robot manufacturers and robotics experts implement an emotion system that adds flexibility to robot behavior.
Journal introduction:
Archives of Design Research (ADR) is an international journal publishing original research in the field of design, including industrial design, visual communication design, interaction design, space design, and service design. It also invites research outcomes from design-related interdisciplinary fields such as the humanities, arts, technology, society, and business. It is an open-access journal, publishing four issues per year. Currently, papers are published in both English and Korean, each with an English abstract. ADR aims to build a strong foundation of knowledge in design through the introduction of basic, applied, and clinical research, and serves as a venue and platform to archive and transfer fundamental design theories, methods, tools, and cases. Research areas covered in the journal include:

- Design Theory and its Methodology
- Design Philosophy, Ethics, Values, and Issues
- Design Education
- Design Management and Strategy
- Sustainability, Culture, History, and Societal Design
- Human Behaviors, Perception, and Emotion
- Semantics, Aesthetics and Experience in Design
- Interaction and Interface Design
- Design Tools and New Media
- Universal Design/Inclusive Design
- Design Creativity
- Design Projects and Case Studies