Title: EXController
Authors: Junjian Zhang, Yaohao Chen, Satoshi Hashizume, Naoya Muramatsu, Kotaro Omomo, Riku Iwasaki, Kaji Wataru, Yoichi Ochiai
DOI: 10.1145/3281505.3283385
Venue: Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology
Published: 2018-11-28
Citations: 1
This paper presents EXController, a new controller-mounted finger posture recognition device designed for VR handheld controllers. We seek to provide additional input through real-time vision sensing by attaching a near-infrared (NIR) camera to the controller. We designed and implemented an exploratory prototype on an HTC Vive controller. The NIR camera is a modified conventional webcam, and its images are classified by a data-driven convolutional neural network (CNN). We designed 12 finger postures and trained the CNN classifier on a dataset collected from 20 subjects, achieving an average cross-subject accuracy of 86.17%, with more than 92% on three of the postures and more than 89% on the four most accurate postures. We also developed a Unity demo that displays matching finger animations, running in real time at approximately 27 fps.
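The abstract does not specify the network architecture, so as a rough illustration of the classification step, the following is a minimal sketch of a 12-class CNN in PyTorch. The layer sizes, the single-channel 64x64 input resolution, and the `PostureCNN` name are all assumptions for illustration, not the authors' actual model.

```python
import torch
import torch.nn as nn

class PostureCNN(nn.Module):
    """Hypothetical 12-class CNN for single-channel NIR frames.

    Input: (batch, 1, 64, 64) grayscale images; output: 12 posture logits.
    """
    def __init__(self, num_postures: int = 12):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_postures),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Run a dummy batch of four NIR frames through the classifier.
model = PostureCNN()
frames = torch.randn(4, 1, 64, 64)
logits = model(frames)            # shape: (4, 12)
predicted = logits.argmax(dim=1)  # predicted posture index per frame
```

In a real-time pipeline like the one described, each camera frame would be preprocessed to this input shape and the `argmax` over the logits would drive the matching finger animation in Unity.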