{"title":"Touch-sensitive interactive projection system","authors":"Ming He, Jun Cheng, Dapeng Tao","doi":"10.1109/SPAC.2014.6982729","DOIUrl":null,"url":null,"abstract":"In this paper, we present a vision-based humancomputer interaction system merely consisting of a projector and a single camera, which is no longer limited to traditional displaying but allowing users to touch on any projected surfaces for interaction. The challenge of bare-hand touch detection in projector-camera system is to recover the depth from user's fingertip to projector with monocular vision. A novel approach is proposed to detect touch action through locally feature from accelerated segment test (FAST) matching between captured image and projected image. By comparing the hamming distance of these features with binary robust invariant scalable keypoint (BRISK), the 3D information near fingertips is able to be probed like deciding if there is a finger touching on table surface. Extensive experiments about hand region segmentation and touch detection are presented to show the robust performance of our system.","PeriodicalId":326246,"journal":{"name":"Proceedings 2014 IEEE International Conference on Security, Pattern Analysis, and Cybernetics (SPAC)","volume":"10 3","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-12-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings 2014 IEEE International Conference on Security, Pattern Analysis, and Cybernetics (SPAC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SPAC.2014.6982729","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
In this paper, we present a vision-based human-computer interaction system consisting of only a projector and a single camera. The system is no longer limited to traditional display: it allows users to touch any projected surface to interact. The challenge of bare-hand touch detection in a projector-camera system is recovering the depth from the user's fingertip to the projector with monocular vision. A novel approach is proposed to detect touch actions by matching features from accelerated segment test (FAST) keypoints between the captured image and the projected image. By comparing the Hamming distance of binary robust invariant scalable keypoints (BRISK) descriptors at these features, the 3D information near the fingertips can be probed, for example to decide whether a finger is touching the table surface. Extensive experiments on hand region segmentation and touch detection demonstrate the robust performance of our system.
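Since the abstract only outlines the matching pipeline, the following is a minimal illustrative sketch (not the authors' implementation) of that idea using OpenCV: FAST keypoints are detected near a candidate fingertip in the captured frame, described with BRISK, and matched against the projected reference image using Hamming distance; a small average descriptor distance would suggest that the projected content is visible right at the fingertip, i.e., the finger is likely touching the surface. The function name touch_score, the fingertip_xy input, the radius window, and the example threshold are assumptions introduced here for illustration only.

```python
# Illustrative sketch (assumed parameters, not the paper's code): FAST + BRISK
# matching between the captured frame and the projected image, with Hamming
# distance used as a touch cue near a candidate fingertip.
import cv2
import numpy as np

def touch_score(captured_gray, projected_gray, fingertip_xy, radius=40):
    """Return the mean Hamming distance of BRISK descriptors matched between
    the region around the fingertip in the captured frame and the projected
    image. fingertip_xy and radius are hypothetical parameters of this sketch."""
    x, y = fingertip_xy
    fast = cv2.FastFeatureDetector_create(threshold=25)
    brisk = cv2.BRISK_create()

    # FAST keypoints in the captured frame, restricted to a window near the fingertip.
    kp_cap = [k for k in fast.detect(captured_gray, None)
              if abs(k.pt[0] - x) < radius and abs(k.pt[1] - y) < radius]
    kp_cap, des_cap = brisk.compute(captured_gray, kp_cap)

    # FAST keypoints and BRISK descriptors of the projected (reference) image.
    kp_proj = fast.detect(projected_gray, None)
    kp_proj, des_proj = brisk.compute(projected_gray, kp_proj)

    if des_cap is None or des_proj is None:
        return float("inf")  # no projected content detected near the fingertip

    # Brute-force matching of binary descriptors with Hamming distance.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_cap, des_proj)
    if not matches:
        return float("inf")
    return float(np.mean([m.distance for m in matches]))

# Usage: a score below some calibrated threshold would be read as "touching".
# touching = touch_score(frame_gray, slide_gray, (cx, cy)) < 60.0
```

The threshold on the mean Hamming distance would have to be calibrated per setup (projector brightness, surface texture, camera noise); the sketch only conveys the matching-based touch cue described in the abstract.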