{"title":"Proposal of eye-gaze recognize method for input interface without infra-red ray equipment","authors":"K. Fukushima, N. Shirahama","doi":"10.1109/SNPD.2014.6888742","DOIUrl":null,"url":null,"abstract":"The main purpose of this study is to develop the inexpensive eye-gaze input interface for disabled people. The eye-gaze inputting can suit many situations, and it has little load for user because it is non-contact input interface. Many projects about the eye-gaze have been studied recently. Most of the productions of eye-gaze inputting use infra-red rays (IR) to detect the iris in the eyes. However the harm of IR for human eyes has been pointed out. In addition, if the interface requires the camera which has IR, the users must purchase the specific devices. Therefore we adopted a PC and the camera doesn't have the IR function. We propose the system by using the motion-template that is one of the functions of OpenCV library for tracking motion. However this function can't recognize the point where the user watches on monitor. It can recognize only the motion. That's why we made calibrate function to relate the eye-gaze with the monitor. We propose two methods to recognize eye-gaze. Both methods require the iris binary image. This image shows the iris shape and we expect that user's visual point can be calculated from this iris image. One method uses gravity point to calculate the point. Other method uses the rectangular approximation to calculate it. We did experiments for some subjects by both methods and compared the results to validate which method is proper and how much the calibrate function is accurate. In the experiment, the function randomly spotted a blue target on the monitor. The target position changes on a regular basis. In this experiment, the user stares at the target and we check the accuracy. If this function or methods are proper, the function correctly recognizes the user stare at target or near. For more accuracy, we will consider about how to detect the iris correctly in the future.","PeriodicalId":272932,"journal":{"name":"15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SNPD.2014.6888742","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
The main purpose of this study is to develop an inexpensive eye-gaze input interface for disabled people. Eye-gaze input suits many situations and places little load on the user because it is a non-contact interface. Many eye-gaze projects have been studied recently. Most eye-gaze input products use infra-red (IR) light to detect the iris in the eye; however, the potential harm of IR to human eyes has been pointed out, and an interface that requires an IR camera forces users to purchase dedicated devices. We therefore use only a PC and a camera without IR capability. Our system uses the motion template, one of the motion-tracking functions of the OpenCV library. This function recognizes only motion; it cannot recognize the point on the monitor that the user is watching, so we built a calibration function that relates the eye gaze to the monitor. We propose two methods to recognize eye gaze. Both require a binary image of the iris, which shows the iris shape; we expect that the user's visual point can be calculated from this image. One method uses the center of gravity of the iris region to calculate the point; the other uses a rectangular approximation. We ran experiments with several subjects using both methods and compared the results to determine which method is more suitable and how accurate the calibration function is. In the experiment, the function placed a blue target at random positions on the monitor, changing the position at regular intervals; the user stared at the target while we checked the accuracy. If the function and methods are sound, the function correctly recognizes that the user is staring at or near the target. To improve accuracy, we will investigate how to detect the iris more reliably in future work.
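The two estimators the abstract contrasts can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper uses OpenCV on camera frames, while here a small Python list stands in for the binary iris image, and the `calibrate` mapping is a hypothetical per-axis linear fit from two reference fixations, introduced only to show how eye coordinates might be related to monitor coordinates.

```python
def iris_pixels(binary):
    """Collect (x, y) coordinates of iris pixels (value 1) in a binary image."""
    return [(x, y) for y, row in enumerate(binary)
                   for x, v in enumerate(row) if v]

def iris_centroid(binary):
    """Method 1: center of gravity (centroid) of the iris region."""
    pts = iris_pixels(binary)
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def iris_rect_center(binary):
    """Method 2: center of the bounding rectangle of the iris region."""
    pts = iris_pixels(binary)
    xs = [x for x, _ in pts]
    ys = [y for _, y in pts]
    return ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)

def calibrate(eye_pts, screen_pts):
    """Hypothetical calibration: fit screen = a * eye + b independently per
    axis from two reference fixations, returning an eye-to-screen mapping."""
    (ex0, ey0), (ex1, ey1) = eye_pts
    (sx0, sy0), (sx1, sy1) = screen_pts
    ax = (sx1 - sx0) / (ex1 - ex0)
    ay = (sy1 - sy0) / (ey1 - ey0)
    bx, by = sx0 - ax * ex0, sy0 - ay * ey0
    return lambda p: (ax * p[0] + bx, ay * p[1] + by)

# An asymmetric iris blob: the two estimators give different eye coordinates,
# which is exactly the difference the experiments compare.
img = [[0, 0, 0, 0, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 0, 0],
       [0, 1, 0, 0, 0],
       [0, 0, 0, 0, 0]]
print(iris_centroid(img))     # skewed toward the dense upper-left pixels
print(iris_rect_center(img))  # (2.0, 2.0), the rectangle's geometric center
```

Either estimate would then be passed through the calibration mapping to obtain a point on the monitor; comparing that point with the known target position gives the accuracy measure described in the experiment.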