A Category-Scalable Framework Using Millimeter-Wave Radar for Spectrogram Generation and Gesture Recognition

Tingpei Huang; Haotian Wang; Rongyu Gao; Jianhang Liu; Shibao Li

IEEE Sensors Journal, vol. 24, no. 22, pp. 38479-38491, published 2024-10-09
DOI: 10.1109/JSEN.2024.3472065 (https://ieeexplore.ieee.org/document/10713069/)
In gesture recognition based on millimeter-wave radar, spectrogram generation is typically designed separately from, and independently of, the actual application. This simple decoupling means that the spectrograms generated from the radar signals are not optimally suited to the recognition task. In addition, the emergence of gesture categories carrying new semantics requires collecting a large amount of new high-quality labeled data and retraining the model. To address these problems, we propose R-CSGR, a radar-based category-scalable gesture recognition framework for gesture spectrogram generation and two-stage gesture recognition. To account for noise and environmental factors, only gesture-related signals are extracted and aggregated in the Doppler and angle dimensions, forming a location-independent, information-dense gesture spectrogram for the two-stage recognition. In the first stage, reconstruction of the spectrograms of the original categories serves as a self-supervised learning task that exploits low-cost unlabeled data. In the second stage, a classification layer based on the cosine nearest-centroid method quickly recognizes new gesture categories while maintaining the recognition capability for the original categories. The results show that, with five new gesture categories introduced and only eight shots per category in the support set, an average recognition accuracy of 96.88% is achieved across all nine gesture categories.
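The second-stage classifier named in the abstract is a cosine nearest-centroid scheme: the few support embeddings of each gesture category are averaged into a centroid, and a query embedding is assigned to the category whose centroid it is most cosine-similar to. Below is a minimal sketch of that idea, assuming embeddings come from some first-stage encoder; the function names, 128-dimensional embeddings, and random placeholder data are illustrative assumptions, not the authors' implementation.

import numpy as np


def build_centroids(support_embeddings: dict[str, np.ndarray]) -> dict[str, np.ndarray]:
    # Average the few-shot support embeddings of each category into one centroid vector.
    return {label: shots.mean(axis=0) for label, shots in support_embeddings.items()}


def cosine_nearest_centroid(query: np.ndarray, centroids: dict[str, np.ndarray]) -> str:
    # Assign the query to the category whose centroid has the highest cosine similarity.
    def cos(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    return max(centroids, key=lambda label: cos(query, centroids[label]))


# Usage sketch: embeddings would come from the encoder trained by spectrogram
# reconstruction in the first stage; random vectors stand in for them here.
rng = np.random.default_rng(0)
support = {f"gesture_{i}": rng.normal(size=(8, 128)) for i in range(9)}  # 8 shots per category
centroids = build_centroids(support)
print(cosine_nearest_centroid(rng.normal(size=128), centroids))

Because new categories only require computing an extra centroid from a handful of support shots, the original categories' centroids stay untouched, which is what allows category scaling without retraining.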
Journal Introduction:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. The IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave (e.g., electromagnetic and acoustic) and non-wave (e.g., chemical, gravity, particle, thermal, radiative and non-radiative) sensor data; detection, estimation, and classification based on sensor data)
-Sensors in Industrial Practice