Hand Gesture Recognition Using Frequency-Modulated Continuous Wave Radar on Tactile Displays for the Visually Impaired
Ahmed Hamza, Santosh Kumar Prabhulingaiah, Pegah Pezeshkpour, Bastian E. Rapp
Advanced Intelligent Systems, vol. 7, no. 2, published 2024-12-15. DOI: 10.1002/aisy.202400663 (https://onlinelibrary.wiley.com/doi/10.1002/aisy.202400663)
Abstract
Touchscreens are essential components of many electronic devices in the daily lives of sighted people in the digital information era. Visually impaired users, by contrast, rely on tactile displays as one of their key devices for interacting with the digital world. However, due to the working mechanism and uneven surface of tactile displays, one of the key features of screens for sighted users is surprisingly challenging to implement: precision touch input. To overcome this, a hand gesture recognition system is developed using a frequency-modulated continuous wave (FMCW) millimeter-wave radar. A multifeature encoder method extracts range and velocity information from the radar and translates the data into spectrogram images. Gesture recognition is implemented for common input gestures: single/double-click, swipe-right/left, scroll-up/down, zoom-in/out, and rotate-anticlockwise/clockwise. Gesture recognition and classification are performed with machine-learning approaches, including support vector machines and convolutional neural networks. The chosen model, You Only Look Once (YOLOv8), achieves an accuracy of 97.1% after only 30 training epochs with only 500 collected data samples per gesture. This research paves the way toward using radar sensors not only for tactile displays but also for other digital devices in human–computer interaction.
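The abstract does not detail the multifeature encoder, but the standard FMCW processing it builds on maps each radar frame to a range–velocity image via two FFTs: a fast-time FFT over the samples within each chirp (range) and a slow-time FFT across chirps (Doppler/velocity). The sketch below, in Python with NumPy and Matplotlib, is a minimal illustration of that step on a synthetic frame; the function name, frame dimensions, and random data are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np
import matplotlib.pyplot as plt

def range_doppler_map(frame: np.ndarray) -> np.ndarray:
    """Compute a range-Doppler magnitude map (dB) from one FMCW radar frame.

    frame: complex beat-signal samples, shape (num_chirps, samples_per_chirp).
    """
    # Window both axes to suppress spectral leakage.
    win_fast = np.hanning(frame.shape[1])          # fast time (within a chirp)
    win_slow = np.hanning(frame.shape[0])[:, None]  # slow time (across chirps)
    x = frame * win_fast * win_slow

    # Fast-time FFT -> range bins; slow-time FFT -> Doppler (velocity) bins.
    rng = np.fft.fft(x, axis=1)
    rd = np.fft.fftshift(np.fft.fft(rng, axis=0), axes=0)  # center zero velocity
    return 20 * np.log10(np.abs(rd) + 1e-12)

# Hypothetical frame of 64 chirps x 256 samples, saved as an image that an
# image-based classifier could consume.
frame = np.random.randn(64, 256) + 1j * np.random.randn(64, 256)
plt.imsave("gesture_frame.png", range_doppler_map(frame), cmap="viridis")
```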
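The paper reports training YOLOv8 for 30 epochs on the resulting spectrogram images. A minimal sketch of such a run with the Ultralytics API is shown below; the dataset path, folder layout (one subfolder per gesture class under train/ and val/), model size, and image resolution are assumptions, not values from the paper.

```python
from ultralytics import YOLO

# Pretrained YOLOv8 classification backbone (nano variant assumed here).
model = YOLO("yolov8n-cls.pt")

# Train for 30 epochs, matching the epoch count reported in the abstract.
# "gestures/" is a hypothetical dataset directory with train/ and val/
# subfolders, each containing one folder of spectrogram images per gesture.
model.train(data="gestures", epochs=30, imgsz=224)

# Evaluate top-1 accuracy on the validation split.
metrics = model.val()
```

An image classifier is a natural fit here because the range-velocity encoding turns each gesture into a 2D texture, letting the small per-gesture sample count (500) be offset by a pretrained backbone.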