A multimodal teleoperation interface for human-robot collaboration
Weiyong Si, Tianjian Zhong, Ning Wang, Chenguang Yang
2023 IEEE International Conference on Mechatronics (ICM), 15 March 2023
DOI: 10.1109/ICM54990.2023.10102060
Abstract
Human-robot collaboration is an effective way to combine human intelligence with robot autonomy, improving both the safety and the efficiency of robotic tasks. However, developing an intuitive, immersive human-robot interface with multimodal feedback for interaction and collaboration remains challenging. In this paper, we develop a multimodal human-robot interface that keeps the human operator in the loop. A Unity-based virtual reality (VR) environment, containing the virtual robot manipulator and its workspace, simulates the robot's real working environment. We integrate a digital twin mechanism into the VR environment, so the virtual scene stays consistent with the physical task. The virtual environment visualizes the visual and haptic feedback from the robot's multimodal sensors, providing an immersive and user-friendly teleoperation environment for human operators. We conducted a user study based on the NASA Task Load Index (NASA-TLX) using a physical contact-scanning task. The results show that the proposed multimodal interface reduces cognitive and physical workload by 31.8% compared with the commercial teleoperation device Touch X.
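
For reference, the NASA Task Load Index used in the evaluation combines six subscale ratings into a single weighted workload score. Below is a minimal Python sketch of that standard computation: the subscale names, the 0-100 rating scale, and the 15 pairwise-comparison weights follow the published NASA-TLX procedure, but every numeric value in the example is an illustrative placeholder, not data from the paper.

```python
# Sketch of the standard weighted NASA-TLX workload computation.
# All ratings and weights below are illustrative, not the paper's data.

SUBSCALES = (
    "mental_demand", "physical_demand", "temporal_demand",
    "performance", "effort", "frustration",
)

def nasa_tlx(ratings: dict, weights: dict) -> float:
    """Overall weighted workload: sum(rating * weight) / 15.

    ratings -- subscale ratings on a 0-100 scale
    weights -- tallies from the 15 pairwise comparisons (must sum to 15)
    """
    assert sum(weights.values()) == 15, "pairwise weights must total 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Illustrative ratings for a baseline device and a proposed interface:
baseline_ratings = {"mental_demand": 70, "physical_demand": 65,
                    "temporal_demand": 55, "performance": 40,
                    "effort": 60, "frustration": 50}
proposed_ratings = {"mental_demand": 45, "physical_demand": 40,
                    "temporal_demand": 40, "performance": 30,
                    "effort": 40, "frustration": 30}
weights = {"mental_demand": 4, "physical_demand": 3, "temporal_demand": 2,
           "performance": 2, "effort": 3, "frustration": 1}

baseline = nasa_tlx(baseline_ratings, weights)
proposed = nasa_tlx(proposed_ratings, weights)
print(f"workload reduction: {(baseline - proposed) / baseline:.1%}")
```

A per-condition score like this, averaged over participants, is what allows the paper to report a single percentage difference in workload between the proposed interface and the Touch X baseline.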