Android-based Touch Screen Projector Design Using a 3D Camera
Mochamad Susantok, Muhammad Diono, M. Saputra, Susi Rubiyati
2018 2nd International Conference on Electrical Engineering and Informatics (ICon EEI), October 2018
DOI: 10.1109/ICon-EEI.2018.8784308
Citations: 1
Abstract
The number of Android users grows every year, and applications are developed around human needs such as work, play, and learning. This calls for new ways of interacting with Android applications. Hand-gesture control of applications has been introduced in recent years, which motivates this research to show how Android accessibility can be controlled with hand gestures. A 3D camera is used as a sensor to capture hand gestures, which are then used to control menus in Android. The supported accessibility actions are moving to the next or previous item, selecting an item, zooming in and out, and returning to the main menu. This research can therefore help Android users present Android content to an audience by interacting directly with the projected screen. The tested distance between the 3D camera and the projector screen is 360 cm. The screen area was tested with a different field size at each corner: field 1 = 70 cm, field 2 = 85 cm, and field 3 = 100 cm. Fields 2 and 3 achieve an average accuracy above 75% for presenter heights in the range of 150–172 cm. Field 1 has better accuracy only for certain movements because it requires a specific presenter height. The average response-time delay for each control command is below 374.3 milliseconds.
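The paper does not include source code, but the control scheme it describes (a gesture classifier driving Android accessibility actions) can be illustrated with a minimal Kotlin sketch. Everything below is an assumption for illustration, not the authors' implementation: the GestureCommand enum, the handleCommand entry point, and the tap/swipe coordinates are hypothetical, while the Android APIs used (AccessibilityService, GestureDescription, dispatchGesture, performGlobalAction) are real and available from API level 24.

```kotlin
// Hypothetical sketch: map hand-gesture labels from a 3D-camera pipeline
// to Android accessibility actions (next/previous item, select, zoom, home).
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path
import android.view.accessibility.AccessibilityEvent

// Assumed command set produced by the gesture classifier.
enum class GestureCommand { NEXT_ITEM, PREVIOUS_ITEM, SELECT, ZOOM_IN, ZOOM_OUT, MAIN_MENU }

class GestureControlService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent) { /* not needed for this sketch */ }
    override fun onInterrupt() {}

    // Entry point the recognition pipeline would call with a classified command.
    fun handleCommand(command: GestureCommand) {
        when (command) {
            GestureCommand.NEXT_ITEM     -> swipe(fromX = 800f, toX = 200f, y = 900f) // swipe left
            GestureCommand.PREVIOUS_ITEM -> swipe(fromX = 200f, toX = 800f, y = 900f) // swipe right
            GestureCommand.SELECT        -> tap(x = 540f, y = 900f)
            GestureCommand.ZOOM_IN       -> pinch(outward = true)
            GestureCommand.ZOOM_OUT      -> pinch(outward = false)
            GestureCommand.MAIN_MENU     -> performGlobalAction(GLOBAL_ACTION_HOME)
        }
    }

    // Single short touch at (x, y), injected as an accessibility gesture.
    private fun tap(x: Float, y: Float) {
        val path = Path().apply { moveTo(x, y) }
        dispatch(GestureDescription.StrokeDescription(path, 0, 50))
    }

    // Horizontal swipe used to move forward/backward between items.
    private fun swipe(fromX: Float, toX: Float, y: Float) {
        val path = Path().apply { moveTo(fromX, y); lineTo(toX, y) }
        dispatch(GestureDescription.StrokeDescription(path, 0, 300))
    }

    // Two simultaneous strokes moving apart (zoom in) or together (zoom out).
    private fun pinch(outward: Boolean) {
        val centerX = 540f; val centerY = 900f
        val (start, end) = if (outward) 100f to 400f else 400f to 100f
        val up = Path().apply { moveTo(centerX, centerY - start); lineTo(centerX, centerY - end) }
        val down = Path().apply { moveTo(centerX, centerY + start); lineTo(centerX, centerY + end) }
        val gesture = GestureDescription.Builder()
            .addStroke(GestureDescription.StrokeDescription(up, 0, 400))
            .addStroke(GestureDescription.StrokeDescription(down, 0, 400))
            .build()
        dispatchGesture(gesture, null, null)
    }

    private fun dispatch(stroke: GestureDescription.StrokeDescription) {
        dispatchGesture(GestureDescription.Builder().addStroke(stroke).build(), null, null)
    }
}
```

In this sketch the 3D-camera side is left out entirely; the service only shows how classified commands could be turned into on-screen interactions, which is one plausible way to realize the menu control the abstract describes.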