{"title":"面向非结构化环境下共享遥操作的交互式虚拟夹具生成框架","authors":"Vitalii Pruks, J. Ryu","doi":"10.1109/ICRA40945.2020.9196579","DOIUrl":null,"url":null,"abstract":"Virtual fixtures (VFs) improve human operator performance in teleoperation scenarios. However, the generation of VFs is challenging, especially in unstructured environments. In this work, we introduce a framework for the interactive generation of VF. The method is based on the observation that a human can easily understand just by looking at the remote environment which VF could help in task execution. We propose a user interface that detects features on camera images and permits interactive selection of the features. We demonstrate how the feature selection can be used for designing VF, providing 6-DOF haptic feedback. In order to make the proposed framework more generally applicable to a wider variety of applications, we formalize the process of virtual fixture generation (VFG) into the specification of features, geometric primitives, and constraints. The framework can be extended further by the introduction of additional components. Through the human subject study, we demonstrate the proposed framework is intuitive, easy to use while effective, especially for performing hard contact tasks.","PeriodicalId":6859,"journal":{"name":"2020 IEEE International Conference on Robotics and Automation (ICRA)","volume":"62 1","pages":"10234-10241"},"PeriodicalIF":0.0000,"publicationDate":"2020-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"A Framework for Interactive Virtual Fixture Generation for Shared Teleoperation in Unstructured Environments\",\"authors\":\"Vitalii Pruks, J. Ryu\",\"doi\":\"10.1109/ICRA40945.2020.9196579\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Virtual fixtures (VFs) improve human operator performance in teleoperation scenarios. However, the generation of VFs is challenging, especially in unstructured environments. In this work, we introduce a framework for the interactive generation of VF. The method is based on the observation that a human can easily understand just by looking at the remote environment which VF could help in task execution. We propose a user interface that detects features on camera images and permits interactive selection of the features. We demonstrate how the feature selection can be used for designing VF, providing 6-DOF haptic feedback. In order to make the proposed framework more generally applicable to a wider variety of applications, we formalize the process of virtual fixture generation (VFG) into the specification of features, geometric primitives, and constraints. The framework can be extended further by the introduction of additional components. 
Through the human subject study, we demonstrate the proposed framework is intuitive, easy to use while effective, especially for performing hard contact tasks.\",\"PeriodicalId\":6859,\"journal\":{\"name\":\"2020 IEEE International Conference on Robotics and Automation (ICRA)\",\"volume\":\"62 1\",\"pages\":\"10234-10241\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-05-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE International Conference on Robotics and Automation (ICRA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICRA40945.2020.9196579\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Conference on Robotics and Automation (ICRA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICRA40945.2020.9196579","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Virtual fixtures (VFs) improve human operator performance in teleoperation scenarios. However, generating VFs is challenging, especially in unstructured environments. In this work, we introduce a framework for the interactive generation of VFs. The method is based on the observation that, just by looking at the remote environment, a human can easily understand which VF would help in task execution. We propose a user interface that detects features on camera images and permits interactive selection of those features. We demonstrate how the selected features can be used to design VFs that provide 6-DOF haptic feedback. To make the proposed framework applicable to a wider variety of applications, we formalize the process of virtual fixture generation (VFG) as the specification of features, geometric primitives, and constraints; the framework can be extended further by introducing additional components. Through a human subject study, we demonstrate that the proposed framework is intuitive and easy to use while remaining effective, especially for hard-contact tasks.
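As a rough illustration of how a VF specified by features, geometric primitives, and constraints might translate into haptic guidance, consider the minimal sketch below. It is not the authors' implementation: the line_fixture_force helper, the stiffness value, and the NumPy-based formulation are assumptions, and only the translational part of the 6-DOF feedback is shown.

```python
# Minimal sketch (assumption, not the paper's implementation): two
# user-selected image features, back-projected to 3-D points, define a
# line primitive; the constraint pulls the tool toward that line with a
# spring-like force that a haptic device could render (torques omitted).
import numpy as np

def line_fixture_force(p_tool, p1, p2, stiffness=200.0):
    """Force (N) pulling the tool position onto the line through p1 and p2."""
    d = p2 - p1
    d = d / np.linalg.norm(d)                  # unit direction of the line primitive
    p_proj = p1 + np.dot(p_tool - p1, d) * d   # closest point on the line to the tool
    return stiffness * (p_proj - p_tool)       # spring force toward the constraint

# Example: tool at (0.1, 0.2, 0.0), line along the x-axis through the origin.
p_tool = np.array([0.1, 0.2, 0.0])
p1, p2 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
print(line_fixture_force(p_tool, p1, p2))      # -> [  0. -40.   0.]
```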