Head Gesture-based Control for Assistive Robots
M. Haseeb, Maria Kyrarini, Shuo Jiang, Danijela Ristić-Durrant, A. Gräser
Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference, 2018-06-26
DOI: 10.1145/3197768.3201574
Citations: 13
Abstract
Developing assistive robots that enable people with disabilities to work is a challenging topic. Hands-free interfaces can help a person with severe motor impairments control robotic manipulators. This paper focuses on developing a head-gesture interface that enables the end-user to control a dual-arm industrial robot. A motion sensor is placed on the end-user's head. A support vector machine is used to recognize head gestures, and an intuitive graphical user interface is developed to help the user navigate between control modes. To evaluate the proposed framework, an industrial pick-and-place task was performed.
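The abstract describes classifying head gestures from a head-mounted motion sensor with a support vector machine. The paper does not spell out the feature extraction or gesture set here, so the following is only a minimal sketch of that idea: the window length, the statistical features, the gesture labels, and the helper names (extract_features, train_classifier) are all assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch (assumptions, not the paper's method): recognizing head
# gestures from windowed IMU samples with an SVM via scikit-learn.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical gesture label set; the paper does not list its gestures here.
GESTURES = ["nod", "shake", "tilt_left", "tilt_right", "idle"]

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize one window of IMU samples (shape: n_samples x 6 for a
    3-axis accelerometer + 3-axis gyroscope) with simple per-axis statistics."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        window.min(axis=0),
        window.max(axis=0),
    ])

def train_classifier(windows, labels):
    """Fit an RBF-kernel SVM on labelled gesture windows."""
    X = np.stack([extract_features(w) for w in windows])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X, labels)
    return clf

# Usage with recorded training data:
#   clf = train_classifier(training_windows, training_labels)
#   gesture = clf.predict(extract_features(live_window)[None, :])[0]
```

The predicted gesture label would then be mapped to a command in the control interface, e.g. selecting a control mode or confirming an action in the GUI.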