Adaptive Interface for Robot Teleoperation using a Genetic Algorithm
Indika B. Wijayasinghe, M. Saadatzi, Srikanth Peetha, D. Popa, Sven Cremer
2018 IEEE 14th International Conference on Automation Science and Engineering (CASE), pp. 50–56, August 2018
DOI: 10.1109/COASE.2018.8560466
Citations: 3
Abstract
The design of User Interfaces (UI) is a vital part of Human Machine Interaction (HMI) and affects performance during collaboration or teleoperation. Ideally, UIs should be intuitive and easy to learn, but their design is challenging, especially for complex tasks involving robots with many degrees of freedom. In this paper, we pose the UI design problem as finding a mapping from an interface device with M input degrees of freedom to commands that drive a robot with N output degrees of freedom. We describe a novel adaptive scheme that can learn the M-to-N input-output map such that certain task-related performance measures are maximized. The resulting “Genetic Adaptive User Interface” (GAUI) is formulated and utilized to minimize a cost function related to the user's teleoperation performance. The algorithm is an unsupervised learning scheme that does not require any knowledge about the robot, the user, or the environment. To validate our approach, we provide simulation and experimental results with a non-holonomic robot and two control interfaces: a joystick and a Myo gesture control armband. Results demonstrate that the adaptively trained map closely mimics the intuitive commands from the joystick interface, and also learns an easily controllable interface with the unintuitive gesture control armband. The abstract formulation of the method allows for easy modifications to the performance measure and application to other HMI tasks.
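To make the core idea concrete, the sketch below shows how a genetic algorithm could evolve an M-to-N input-output map by minimizing a cost function. This is not the paper's GAUI implementation: it assumes a linear map and a mean-squared tracking cost as stand-ins for the task-related performance measure, and all names, dimensions, and operator parameters are hypothetical.

```python
import numpy as np

# Hypothetical sketch of the GAUI idea: evolve a linear map W (N x M) that
# converts M interface inputs into N robot commands, minimizing a cost.
# The cost, population size, and mutation scheme are placeholders chosen
# for illustration, not the paper's actual parameters.

M, N = 8, 2            # e.g., 8 armband channels in, 2 velocity commands out
POP, GENS = 30, 200    # population size and number of generations
MUT_STD = 0.1          # standard deviation of mutation noise

rng = np.random.default_rng(0)

def cost(W, inputs, targets):
    """Placeholder cost: mean-squared error between mapped commands and a
    reference trajectory. Stands in for the paper's teleoperation-performance
    measure."""
    commands = inputs @ W.T              # (T, N) commands from (T, M) inputs
    return np.mean((commands - targets) ** 2)

def evolve(inputs, targets):
    pop = rng.normal(size=(POP, N, M))   # random initial candidate maps
    for _ in range(GENS):
        costs = np.array([cost(W, inputs, targets) for W in pop])
        elite = pop[np.argsort(costs)[:POP // 2]]   # keep the better half
        children = elite + rng.normal(scale=MUT_STD, size=elite.shape)
        pop = np.concatenate([elite, children])     # elitism + mutation
    costs = np.array([cost(W, inputs, targets) for W in pop])
    return pop[np.argmin(costs)]

# Toy usage: recover a known map from noisy demonstration data.
W_true = rng.normal(size=(N, M))
X = rng.normal(size=(500, M))
Y = X @ W_true.T + 0.01 * rng.normal(size=(500, N))
W_learned = evolve(X, Y)
print("final cost:", cost(W_learned, X, Y))
```

Because the search only queries the cost of each candidate map, this scheme needs no model of the robot, user, or environment, which mirrors the unsupervised character of the approach described in the abstract.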