An Interactive System to Control a Humanoid Robot using Vision and Voice
Lee Yi Yong, S. Gobee, V. Durairajah
2022 Sixth International Conference on I-SMAC (IoT in Social, Mobile, Analytics and Cloud) (I-SMAC), published 2022-11-10
DOI: 10.1109/I-SMAC55078.2022.9987307 (https://doi.org/10.1109/I-SMAC55078.2022.9987307)
Citations: 0
Abstract
Human-Robot Interaction (HRI) can improve a system's effectiveness if implemented properly. This project presents an interactive HRI system for controlling a humanoid robot using vision and voice. The proposed system aims to ease the difficulty of controlling a robot and to create an effective vision and voice system. The vision system is implemented as color-based object tracking on the robot head, while the voice-controlled system is implemented as limb movement control through voice commands. The two systems achieve average accuracies of 84% and 84.29%, respectively. The robot head and limb movements reach maximum average errors of only 2° and 2.11°, and the voice-controlled system has an average response time of 1.73 s. Possible future enhancements include considering other features, such as texture, in the object tracking system, and adding noise filtering to the voice recognition, to improve their accuracy.
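The paper does not publish its implementation, but the color-based head-tracking idea it describes can be sketched as follows: threshold each frame against a color range, find the centroid of the matching pixels, and steer the head to drive that centroid toward the image center. The NumPy-only helpers below are a hypothetical illustration (the function names, RGB thresholds, and error convention are assumptions, not taken from the paper):

```python
import numpy as np

def track_color(frame, lower, upper):
    """Return the (row, col) centroid of pixels whose RGB values fall
    inside [lower, upper], or None if no pixel matches."""
    lower, upper = np.asarray(lower), np.asarray(upper)
    mask = np.all((frame >= lower) & (frame <= upper), axis=-1)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def head_error(centroid, frame_shape):
    """Pixel offset of the target from the image center; a head
    controller would turn to drive this offset toward (0, 0)."""
    return (centroid[0] - frame_shape[0] / 2,
            centroid[1] - frame_shape[1] / 2)

# Example: a 10x10 black frame with a reddish 2x2 blob at rows 2-3, cols 6-7.
frame = np.zeros((10, 10, 3), dtype=np.uint8)
frame[2:4, 6:8] = [200, 20, 20]
centroid = track_color(frame, lower=[150, 0, 0], upper=[255, 80, 80])
# centroid == (2.5, 6.5); head_error(centroid, frame.shape) == (-2.5, 1.5)
```

In practice a real system would work in HSV space (which separates hue from brightness, making the threshold more robust to lighting) and convert the pixel offset into pan/tilt servo angles via the camera's field of view.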