{"title":"Mentor机器人控制的手势界面设计","authors":"M. Dehbi, Z. Ahmed Foitih","doi":"10.1109/ICOSC.2013.6750976","DOIUrl":null,"url":null,"abstract":"In the field of Man-Machine interaction, gestural communication is expected to play a more and more important role due to its direct, natural character and its many potential uses. Our work has been chosen in the same context as robots control through gestural interfaces specifically using hand gesture. This article presents the different steps in the design of such a gestural servoing system, starting with hand gesture recognition through its interpretation in order to manipulate the Mentor robot (virtual and real), in real time. This gesture recognition system encompasses gesture acquisition, segmentation and identification using principal component analysis. Once the gesture has been recognized, it is operated to control the robot using two techniques: articular command and operational command.","PeriodicalId":199135,"journal":{"name":"3rd International Conference on Systems and Control","volume":"36 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Designing a gestural interface for Mentor robot control\",\"authors\":\"M. Dehbi, Z. Ahmed Foitih\",\"doi\":\"10.1109/ICOSC.2013.6750976\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In the field of Man-Machine interaction, gestural communication is expected to play a more and more important role due to its direct, natural character and its many potential uses. Our work has been chosen in the same context as robots control through gestural interfaces specifically using hand gesture. This article presents the different steps in the design of such a gestural servoing system, starting with hand gesture recognition through its interpretation in order to manipulate the Mentor robot (virtual and real), in real time. 
This gesture recognition system encompasses gesture acquisition, segmentation and identification using principal component analysis. Once the gesture has been recognized, it is operated to control the robot using two techniques: articular command and operational command.\",\"PeriodicalId\":199135,\"journal\":{\"name\":\"3rd International Conference on Systems and Control\",\"volume\":\"36 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"3rd International Conference on Systems and Control\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICOSC.2013.6750976\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"3rd International Conference on Systems and Control","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICOSC.2013.6750976","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Designing a gestural interface for Mentor robot control
In the field of man-machine interaction, gestural communication is expected to play an increasingly important role due to its direct, natural character and its many potential uses. Our work falls within this context: controlling robots through gestural interfaces, specifically hand gestures. This article presents the steps in the design of such a gestural servoing system, from hand gesture recognition through its interpretation, in order to manipulate the Mentor robot (virtual and real) in real time. The gesture recognition system encompasses gesture acquisition, segmentation, and identification using principal component analysis. Once a gesture has been recognized, it is used to control the robot through two techniques: articular (joint-space) commands and operational-space commands.
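The abstract names principal component analysis as the identification step of the recognition pipeline. As an illustration only (the paper's actual feature extraction and classifier details are not given here), a minimal PCA-based matcher can be sketched: project flattened gesture images onto the leading principal axes and classify a new sample by nearest neighbour in that reduced space. All function names and the toy data below are hypothetical, not taken from the paper.

```python
import numpy as np

def pca_fit(X, n_components):
    """Fit PCA on row-vector samples X of shape (n_samples, n_features)."""
    mean = X.mean(axis=0)
    # SVD of the centered data yields the principal axes in Vt's rows.
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def pca_project(X, mean, components):
    """Project samples onto the retained principal axes."""
    return (X - mean) @ components.T

def classify(sample, train_proj, train_labels, mean, components):
    """Label a sample by its nearest training projection in PCA space."""
    p = pca_project(sample[None, :], mean, components)
    dists = np.linalg.norm(train_proj - p, axis=1)
    return train_labels[int(np.argmin(dists))]

# Toy example: two well-separated "gesture" classes as flattened vectors.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (10, 64)),   # class 0
               rng.normal(5.0, 1.0, (10, 64))])  # class 1
y = np.array([0] * 10 + [1] * 10)

mean, comps = pca_fit(X, n_components=3)
proj = pca_project(X, mean, comps)
# A slightly perturbed class-0 sample should match class 0.
label = classify(X[0] + rng.normal(0.0, 0.1, 64), proj, y, mean, comps)
print(label)
```

In a real system the rows of `X` would be preprocessed, segmented hand images rather than random vectors, and the recognized label would then be mapped to an articular or operational robot command.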