Improving the Robustness of Automotive Gesture Recognition by Diversified Simulation Datasets
Nicolai Kern, Julian Aguilar, Pirmin Schoeder, C. Waldschmidt
2023 IEEE Radar Conference (RadarConf23), May 2023
DOI: 10.1109/RadarConf2351548.2023.10149625
Abstract
A key element of the interaction between pedestrians and autonomous vehicles is the automated recognition of traffic and communication gestures. Gestures help vehicles resolve critical or ambiguous situations. Detecting gestures with radar sensors is advantageous with respect to environmental and lighting conditions. However, collecting a radar dataset that covers the wide range of variations in automotive scenarios demands high cost and effort. On the other hand, datasets with limited variation lead to reduced recognition accuracy or even complete failure in new scenarios. Hence, this paper analyzes the impact that deficiencies of traffic gesture datasets can have on classification accuracy and investigates mitigation strategies based on augmentation with simulated, variation-rich radar data. It is shown that augmentation significantly improves the robustness of a convolutional neural network (CNN)-based classifier against variations not covered by the training data. As a key result, both complete failure of the classifier and strongly decreased classification accuracy are avoided.
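The core idea of the abstract, enriching a limited measured dataset with simulated, variation-rich samples before training, can be sketched as a simple dataset-mixing step. The following is a minimal illustration only: the array shapes, the number of gesture classes, the random placeholder data, and the `sim_ratio` knob are all assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for radar gesture data: one micro-Doppler map of
# shape (time_bins, doppler_bins) per sample, with one label per gesture.
measured_maps = rng.standard_normal((200, 64, 32)).astype(np.float32)
measured_labels = rng.integers(0, 5, size=200)   # 5 gesture classes (assumed)
simulated_maps = rng.standard_normal((600, 64, 32)).astype(np.float32)
simulated_labels = rng.integers(0, 5, size=600)

def augment_with_simulation(real_x, real_y, sim_x, sim_y, sim_ratio=0.5, seed=0):
    """Mix measured and simulated samples into one shuffled training set.

    sim_ratio is the target fraction of simulated samples in the result;
    it is an illustrative knob, not a value taken from the paper.
    """
    mix_rng = np.random.default_rng(seed)
    # Number of simulated samples needed to reach the requested fraction.
    n_sim = int(len(real_x) * sim_ratio / (1.0 - sim_ratio))
    n_sim = min(n_sim, len(sim_x))
    pick = mix_rng.choice(len(sim_x), size=n_sim, replace=False)
    x = np.concatenate([real_x, sim_x[pick]])
    y = np.concatenate([real_y, sim_y[pick]])
    # Shuffle so measured and simulated samples are interleaved in training.
    order = mix_rng.permutation(len(x))
    return x[order], y[order]

train_x, train_y = augment_with_simulation(
    measured_maps, measured_labels, simulated_maps, simulated_labels, sim_ratio=0.5
)
print(train_x.shape, train_y.shape)  # (400, 64, 32) (400,)
```

In practice the CNN classifier would then be trained on `train_x`/`train_y`; the robustness gain reported in the paper comes from the simulated samples covering variations absent from the measured data.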