{"title":"Towards safe motion planning for industrial human-robot interaction: A co-evolution approach based on human digital twin and mixed reality","authors":"Bohan Feng, Zeqing Wang, Lianjie Yuan, Qi Zhou, Yulin Chen, Youyi Bi","doi":"10.1016/j.rcim.2025.103012","DOIUrl":null,"url":null,"abstract":"<div><div>Advanced human-robot interaction (HRI) is essential for the next-generation human-centric manufacturing mode such as “Industry 5.0”. Despite recent mutual cognitive approaches can enhance the understanding and collaboration between humans and robots, these methods often rely on predefined rules and are limited in adapting to new tasks or changes of the working environment. These limitations can hinder the popularization of collaborative robots in dynamic manufacturing environments, where tasks can be highly variable, and unforeseen operational changes frequently occur. To address these challenges, we propose a co-evolution approach for the safe motion planning of industrial human-robot interaction. The core idea is to promote the evolution of human worker’s safe operation cognition as well as the evolution of robot’s safe motion planning strategy in a unified and continuous framework by leveraging human digital twin (HDT) and mixed reality (MR) technologies. Specifically, HDT captures real-time human behaviors and postures, which enables robots to adapt dynamically to the changes of human behavior and environment. HDT also refines deep reinforcement learning (DRL)-based motion planning, allowing robots to continuously learn from human actions and update their motion strategies. On the other hand, MR superimposes rich information regarding the tasks and robot in the physical world, helping human workers better understand and adapt to robot’s actions. MR also provides intuitive gesture-based user interface, further improving the smoothness of human-robot interaction. 
We validate the proposed approach’s effectiveness with evaluations in realistic manufacturing scenarios, demonstrating its potential to advance HRI practice in the context of smart manufacturing.</div></div>","PeriodicalId":21452,"journal":{"name":"Robotics and Computer-integrated Manufacturing","volume":"95 ","pages":"Article 103012"},"PeriodicalIF":9.1000,"publicationDate":"2025-03-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Robotics and Computer-integrated Manufacturing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0736584525000663","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0
Abstract
Advanced human-robot interaction (HRI) is essential for next-generation human-centric manufacturing paradigms such as "Industry 5.0". Although recent mutual-cognition approaches can enhance the understanding and collaboration between humans and robots, these methods often rely on predefined rules and adapt poorly to new tasks or changes in the working environment. These limitations can hinder the adoption of collaborative robots in dynamic manufacturing environments, where tasks are highly variable and unforeseen operational changes frequently occur. To address these challenges, we propose a co-evolution approach for the safe motion planning of industrial human-robot interaction. The core idea is to promote the evolution of the human worker's safe-operation cognition together with the evolution of the robot's safe motion-planning strategy in a unified, continuous framework, by leveraging human digital twin (HDT) and mixed reality (MR) technologies. Specifically, the HDT captures real-time human behaviors and postures, enabling robots to adapt dynamically to changes in human behavior and the environment. The HDT also refines deep reinforcement learning (DRL)-based motion planning, allowing robots to continuously learn from human actions and update their motion strategies. In turn, MR superimposes rich information about the tasks and the robot onto the physical world, helping human workers better understand and adapt to the robot's actions. MR also provides an intuitive gesture-based user interface, further improving the smoothness of human-robot interaction. We validate the proposed approach's effectiveness with evaluations in realistic manufacturing scenarios, demonstrating its potential to advance HRI practice in the context of smart manufacturing.
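The abstract's core loop can be illustrated with a minimal sketch, not the paper's implementation: a toy 1-D workspace where a planner (a greedy stand-in for the DRL policy) chooses robot velocities to approach a goal while keeping a safety margin from a human position streamed from a digital-twin proxy. The function and parameter names (`hdt_human_position`, `safety_margin`) are hypothetical.

```python
import math

def hdt_human_position(t):
    """Stand-in for the HDT stream: the human oscillates around x = 5.0."""
    return 5.0 + 1.5 * math.sin(0.3 * t)

def reward(robot_x, human_x, goal_x, safety_margin=1.0):
    """Progress reward with a strong penalty for unsafe proximity."""
    r = -abs(goal_x - robot_x)          # closer to goal -> higher reward
    if abs(robot_x - human_x) < safety_margin:
        r -= 10.0                       # safety penalty dominates progress
    return r

def plan_step(robot_x, human_x, goal_x, actions=(-0.5, 0.0, 0.5)):
    """One-step greedy choice over candidate velocities (DRL-policy stand-in)."""
    return max(actions, key=lambda a: reward(robot_x + a, human_x, goal_x))

def run(goal_x=8.0, steps=40):
    """Simulate the co-adaptation loop: replan each step from the HDT state."""
    x = 0.0
    for t in range(steps):
        x += plan_step(x, hdt_human_position(t), goal_x)
    return x
```

For example, `plan_step(4.2, 5.0, 8.0)` returns `-0.5`: advancing or holding position would leave the robot inside the safety margin, so retreating scores best despite losing progress, which mirrors how the safety term is meant to override task progress when the human is close.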
About the journal:
The journal, Robotics and Computer-Integrated Manufacturing, focuses on sharing research applications that contribute to the development of new or enhanced robotics, manufacturing technologies, and innovative manufacturing strategies that are relevant to industry. Papers that combine theory and experimental validation are preferred, while review papers on current robotics and manufacturing issues are also considered. However, papers on traditional machining processes, modeling and simulation, supply chain management, and resource optimization are generally not within the scope of the journal, as there are more appropriate journals for these topics. Similarly, papers that are overly theoretical or mathematical will be directed to other suitable journals. The journal welcomes original papers in areas such as industrial robotics, human-robot collaboration in manufacturing, cloud-based manufacturing, cyber-physical production systems, big data analytics in manufacturing, smart mechatronics, machine learning, adaptive and sustainable manufacturing, and other fields involving unique manufacturing technologies.