Masaya Ogawa, K. Honda, Yoshihiro Sato, S. Kudoh, Takeshi Oishi, K. Ikeuchi
{"title":"基于任务模型的仿人机器人遥操作运动生成","authors":"Masaya Ogawa, K. Honda, Yoshihiro Sato, S. Kudoh, Takeshi Oishi, K. Ikeuchi","doi":"10.1109/ROMAN.2015.7333619","DOIUrl":null,"url":null,"abstract":"In recent years, the research of humanoid robots that replace human tasks in emergency situations have been widely studied. Currently, many approaches are automate dedicated hardware for each mission. But, at the environment where situation changes, operation by humanoid robot is effective to operate equipments which designed for human. Ultimately, automation is ideal, but under the present circumstances, teleoperation of humanoid robot is effective for corresponding changes of situation. An intuitive interface is required for effectively controlling the humanoid robot from a distant place. Recently, the interfaces that map the human motion to the humanoid robot have become popular because of the development of the motion recognition systems. However, the humanoid robot and human beings have different joint structure, physical ability and weight balance. It is not practical to map the motion directly. There is also the issue of time delay between the operator and the robot. Therefore, it is desirable that the operator performs global judgments and the robot runs semi-autonomously in the local environment. In this paper we propose a method to remotely operate the humanoid robot by the task model. Our method describes human behavior abstractly by the task model and mapped this abstract expressions to humanoid robots, and overcome difference of structure of body. In this work, we operate lever of buggy-type vehicles as a example of mapping using the task model.","PeriodicalId":119467,"journal":{"name":"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-11-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Motion generation of the humanoid robot for teleoperation by task model\",\"authors\":\"Masaya Ogawa, K. Honda, Yoshihiro Sato, S. Kudoh, Takeshi Oishi, K. Ikeuchi\",\"doi\":\"10.1109/ROMAN.2015.7333619\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In recent years, the research of humanoid robots that replace human tasks in emergency situations have been widely studied. Currently, many approaches are automate dedicated hardware for each mission. But, at the environment where situation changes, operation by humanoid robot is effective to operate equipments which designed for human. Ultimately, automation is ideal, but under the present circumstances, teleoperation of humanoid robot is effective for corresponding changes of situation. An intuitive interface is required for effectively controlling the humanoid robot from a distant place. Recently, the interfaces that map the human motion to the humanoid robot have become popular because of the development of the motion recognition systems. However, the humanoid robot and human beings have different joint structure, physical ability and weight balance. It is not practical to map the motion directly. There is also the issue of time delay between the operator and the robot. Therefore, it is desirable that the operator performs global judgments and the robot runs semi-autonomously in the local environment. In this paper we propose a method to remotely operate the humanoid robot by the task model. 
Our method describes human behavior abstractly by the task model and mapped this abstract expressions to humanoid robots, and overcome difference of structure of body. In this work, we operate lever of buggy-type vehicles as a example of mapping using the task model.\",\"PeriodicalId\":119467,\"journal\":{\"name\":\"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)\",\"volume\":\"32 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-11-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ROMAN.2015.7333619\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ROMAN.2015.7333619","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Motion generation of the humanoid robot for teleoperation by task model
In recent years, humanoid robots that take over human tasks in emergency situations have been widely studied. Many current approaches automate dedicated hardware for each mission, but in environments where the situation changes, a humanoid robot is well suited to operating equipment designed for humans. Full automation is the ultimate goal, yet under present circumstances teleoperation of a humanoid robot is an effective way to respond to changing situations. Controlling a humanoid robot effectively from a distance requires an intuitive interface. Interfaces that map human motion onto the humanoid robot have recently become popular thanks to advances in motion recognition systems. However, a humanoid robot and a human differ in joint structure, physical ability, and weight balance, so mapping the motion directly is not practical. There is also the issue of time delay between the operator and the robot. It is therefore desirable for the operator to make global judgments while the robot runs semi-autonomously in the local environment. In this paper we propose a method for remotely operating a humanoid robot through a task model. The method describes human behavior abstractly with the task model and maps these abstract expressions onto the humanoid robot, overcoming the difference in body structure. As an example of mapping with the task model, we operate the lever of a buggy-type vehicle.
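To make the idea of a task-model interface concrete, the following is a minimal sketch of how an operator-side abstract task command could be turned into robot-specific motion. The paper's actual task-model representation is not given in the abstract, so every class, function, and parameter name here (e.g. `LeverTask`, `generate_robot_motion`) is a hypothetical illustration, not the authors' implementation.

```python
# Illustrative sketch only: names and structures are assumptions, not the
# representation used in the paper.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class LeverTask:
    """Abstract, body-independent description of a lever operation.

    The operator specifies *what* to do (which lever, target angle);
    the robot side decides *how* to do it with its own kinematics.
    """
    lever_id: str
    target_angle_rad: float      # desired lever angle
    grasp_required: bool = True


def generate_robot_motion(task: LeverTask,
                          lever_pose: Dict[str, float]) -> List[Dict[str, float]]:
    """Map the abstract task onto robot-specific motion keyframes.

    Stands in for the robot-side planner that absorbs differences in joint
    structure and reach between the human operator and the robot.
    """
    keyframes: List[Dict[str, float]] = []
    if task.grasp_required:
        # Approach and grasp the lever at its current pose.
        keyframes.append({"hand_x": lever_pose["x"],
                          "hand_y": lever_pose["y"],
                          "gripper": 1.0})
    # Move the hand along the lever arc toward the commanded angle.
    keyframes.append({"hand_x": lever_pose["x"],
                      "hand_y": lever_pose["y"],
                      "lever_angle": task.target_angle_rad,
                      "gripper": 1.0})
    return keyframes


if __name__ == "__main__":
    # Operator-side command: an abstract task, not raw joint angles.
    task = LeverTask(lever_id="throttle", target_angle_rad=0.6)
    plan = generate_robot_motion(task, lever_pose={"x": 0.4, "y": 0.1})
    for frame in plan:
        print(frame)
```

Because the operator sends only the abstract task, the same command can be executed by robots with different body structures, and local semi-autonomous execution mitigates the effect of communication delay.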