{"title":"AirDraw: Leveraging smart watch motion sensors for mobile human computer interactions","authors":"Danial Moazen, Seyed Sajjadi, A. Nahapetian","doi":"10.1109/CCNC.2016.7444820","DOIUrl":null,"url":null,"abstract":"Wearable computing is one of the fastest growing technology markets today, with smart watches poised to take over at least of half the wearable device market. Approaches to text entry on smart watches and other wrist worn systems, independent of the small screen, is of importance to the further growth of wearable systems. The consistent user interaction and hands-free, heads-up operation of smart watches paves the way for gesture recognition methods for text entry. This paper proposes a new text input method for smart watches, which utilizes motion sensor data and machine learning approaches to detect letters written in the air by a user. This method is less computationally intensive, less expensive, and unaffected by lighting factors, when compared to computer vision approaches. The AirDraw system prototype developed to test this approach is presented, along with experimental results with close to 71% accuracy in letter recognition.","PeriodicalId":399247,"journal":{"name":"2016 13th IEEE Annual Consumer Communications & Networking Conference (CCNC)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"35","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 13th IEEE Annual Consumer Communications & Networking Conference (CCNC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCNC.2016.7444820","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 35
Abstract
Wearable computing is one of the fastest growing technology markets today, with smart watches poised to take over at least half of the wearable device market. Approaches to text entry on smart watches and other wrist-worn systems, independent of the small screen, are important to the further growth of wearable systems. The consistent user interaction and hands-free, heads-up operation of smart watches pave the way for gesture recognition methods for text entry. This paper proposes a new text input method for smart watches, which utilizes motion sensor data and machine learning approaches to detect letters written in the air by a user. Compared to computer vision approaches, this method is less computationally intensive, less expensive, and unaffected by lighting conditions. The AirDraw system prototype developed to test this approach is presented, along with experimental results showing close to 71% accuracy in letter recognition.
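The abstract describes the pipeline only at a high level (motion sensor traces in, a machine-learned letter label out). The following is a minimal, self-contained sketch of that idea, not the paper's actual method: it uses synthetic accelerometer traces for two hypothetical gestures and a toy nearest-centroid classifier over per-axis mean/standard-deviation features. All function names, the feature set, and the classifier choice are illustrative assumptions.

```python
import math
import random


def extract_features(trace):
    """Summarize a raw (x, y, z) accelerometer trace as per-axis mean and std.

    This simple feature set is an illustrative assumption; the paper does not
    specify its features here.
    """
    feats = []
    for axis in range(3):
        vals = [sample[axis] for sample in trace]
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        feats.extend([mean, math.sqrt(var)])
    return feats


class NearestCentroid:
    """Toy classifier: predict the label whose feature centroid is closest."""

    def fit(self, X, y):
        by_label = {}
        for feats, label in zip(X, y):
            by_label.setdefault(label, []).append(feats)
        self.centroids = {
            label: [sum(col) / len(col) for col in zip(*rows)]
            for label, rows in by_label.items()
        }
        return self

    def predict(self, feats):
        return min(
            self.centroids,
            key=lambda lbl: sum(
                (a - b) ** 2 for a, b in zip(feats, self.centroids[lbl])
            ),
        )


random.seed(0)


def fake_trace(letter, n=50):
    """Synthetic stand-in for sensor data: a circular 'O' gesture vs. an
    'L' gesture made of two straight strokes, with small noise on z."""
    if letter == "O":
        return [(math.cos(t / 5), math.sin(t / 5), random.gauss(0, 0.05))
                for t in range(n)]
    return [(1.0 if t < n // 2 else 0.0,
             0.0 if t < n // 2 else 1.0,
             random.gauss(0, 0.05)) for t in range(n)]


labels = ["O", "L"] * 10
X = [extract_features(fake_trace(lbl)) for lbl in labels]
clf = NearestCentroid().fit(X, labels)

pred_o = clf.predict(extract_features(fake_trace("O")))
pred_l = clf.predict(extract_features(fake_trace("L")))
```

A real system would additionally need gesture segmentation (detecting when a letter starts and ends) and a classifier trained on many users' handwriting, which is where the reported ~71% accuracy figure comes from.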