{"title":"Hand Talk-Implementation of a Gesture Recognizing Glove","authors":"Celestine Preetham, Ganesh Ramakrishnan, Sujan Kumar Gonugondla, Anish Tamse, N. Krishnapura","doi":"10.1109/TIIEC.2013.65","DOIUrl":null,"url":null,"abstract":"We present our prototype for a gesture recognizing glove (data glove). We use low cost packaging material (velostat) for making piezoresistive sensors. These flex sensors detect a bend in fingers and we map this data to a character set by implementing a Minimum Mean Square Error machine learning algorithm. The recognized character is transmitted via Bluetooth, to an Android phone, which performs a text to speech conversion. Our motivation for Hand Talk is to compare hand configurations with sign language charts and generate artificial speech which articulates the gestured words. This technology also has further applications as a 3D mouse, virtual keyboard, control for precision control of robotic arms.","PeriodicalId":250687,"journal":{"name":"2013 Texas Instruments India Educators' Conference","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-04-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"55","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 Texas Instruments India Educators' Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TIIEC.2013.65","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 55
Abstract
We present our prototype of a gesture-recognizing glove (data glove). We use a low-cost packaging material (Velostat) to make piezoresistive flex sensors. These sensors detect the bend of each finger, and we map the sensor data to a character set using a minimum mean square error (MMSE) machine learning algorithm. The recognized character is transmitted via Bluetooth to an Android phone, which performs text-to-speech conversion. Our motivation for Hand Talk is to compare hand configurations against sign language charts and generate artificial speech that articulates the gestured words. This technology also has further applications as a 3D mouse, a virtual keyboard, and a controller for precision control of robotic arms.
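
As a rough illustration of the MMSE mapping described in the abstract (a sketch, not the authors' implementation), the snippet below classifies a normalized flex-sensor reading by selecting the stored character template with the lowest mean squared error. The template values, the five-sensor layout, and the three-character set are hypothetical, chosen only for illustration.

```python
import numpy as np

# Hypothetical per-character templates of normalized flex-sensor readings
# (one entry per character, one value per finger sensor). The paper's actual
# training data, sensor count, and character set are not given here.
TEMPLATES = {
    "A": np.array([0.9, 0.8, 0.8, 0.8, 0.2]),
    "B": np.array([0.1, 0.1, 0.1, 0.1, 0.9]),
    "C": np.array([0.5, 0.5, 0.5, 0.5, 0.5]),
}

def classify(reading: np.ndarray) -> str:
    """Return the character whose template gives the minimum mean squared
    error against the current flex-sensor reading."""
    return min(TEMPLATES, key=lambda ch: np.mean((TEMPLATES[ch] - reading) ** 2))

if __name__ == "__main__":
    sample = np.array([0.85, 0.75, 0.80, 0.82, 0.25])  # example normalized reading
    print(classify(sample))  # closest template under MMSE -> "A"
```

In a complete system along the lines described above, the winning character would then be sent over Bluetooth to the phone for text-to-speech output.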