Shuai Wang, Luoyu Mei, Zhimeng Yin, Hao Li, Ruofeng Liu, Wenchao Jiang, Chris Xiaoxuan Lu
{"title":"基于毫米波雷达和视觉融合的自动驾驶车辆端到端目标活动检测","authors":"Shuai Wang, Luoyu Mei, Zhimeng Yin, Hao Li, Ruofeng Liu, Wenchao Jiang, Chris Xiaoxuan Lu","doi":"10.1145/3628453","DOIUrl":null,"url":null,"abstract":"The successful operation of autonomous vehicles hinges on their ability to accurately identify objects in their vicinity, particularly living targets such as bikers and pedestrians. However, visual interference inherent in real-world environments, such as omnipresent billboards, poses substantial challenges to extant vision-based detection technologies. These visual interference exhibit similar visual attributes to living targets, leading to erroneous identification. We address this problem by harnessing the capabilities of mmWave radar, a vital sensor in autonomous vehicles, in combination with vision technology, thereby contributing a unique solution for liveness target detection. We propose a methodology that extracts features from the mmWave radar signal to achieve end-to-end liveness target detection by integrating the mmWave radar and vision technology. This proposed methodology is implemented and evaluated on the commodity mmWave radar IWR6843ISK-ODS and vision sensor Logitech camera. 
Our extensive evaluation reveals that the proposed method accomplishes liveness target detection with a mean average precision (mAP) of 98.1%, surpassing the performance of existing studies.","PeriodicalId":50910,"journal":{"name":"ACM Transactions on Sensor Networks","volume":"243 1","pages":"0"},"PeriodicalIF":3.9000,"publicationDate":"2023-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"End-to-End Target Liveness Detection via mmWave Radar and Vision Fusion for Autonomous Vehicles\",\"authors\":\"Shuai Wang, Luoyu Mei, Zhimeng Yin, Hao Li, Ruofeng Liu, Wenchao Jiang, Chris Xiaoxuan Lu\",\"doi\":\"10.1145/3628453\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The successful operation of autonomous vehicles hinges on their ability to accurately identify objects in their vicinity, particularly living targets such as bikers and pedestrians. However, visual interference inherent in real-world environments, such as omnipresent billboards, poses substantial challenges to extant vision-based detection technologies. These visual interference exhibit similar visual attributes to living targets, leading to erroneous identification. We address this problem by harnessing the capabilities of mmWave radar, a vital sensor in autonomous vehicles, in combination with vision technology, thereby contributing a unique solution for liveness target detection. We propose a methodology that extracts features from the mmWave radar signal to achieve end-to-end liveness target detection by integrating the mmWave radar and vision technology. This proposed methodology is implemented and evaluated on the commodity mmWave radar IWR6843ISK-ODS and vision sensor Logitech camera. 
Our extensive evaluation reveals that the proposed method accomplishes liveness target detection with a mean average precision (mAP) of 98.1%, surpassing the performance of existing studies.\",\"PeriodicalId\":50910,\"journal\":{\"name\":\"ACM Transactions on Sensor Networks\",\"volume\":\"243 1\",\"pages\":\"0\"},\"PeriodicalIF\":3.9000,\"publicationDate\":\"2023-10-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM Transactions on Sensor Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3628453\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Sensor Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3628453","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
End-to-End Target Liveness Detection via mmWave Radar and Vision Fusion for Autonomous Vehicles
The successful operation of autonomous vehicles hinges on their ability to accurately identify objects in their vicinity, particularly living targets such as bikers and pedestrians. However, visual interference inherent in real-world environments, such as omnipresent billboards, poses substantial challenges to existing vision-based detection technologies. These sources of visual interference exhibit visual attributes similar to those of living targets, leading to erroneous identification. We address this problem by harnessing the capabilities of mmWave radar, a vital sensor in autonomous vehicles, in combination with vision technology, thereby contributing a unique solution for liveness target detection. We propose a methodology that extracts features from the mmWave radar signal and integrates mmWave radar with vision technology to achieve end-to-end liveness target detection. The proposed methodology is implemented and evaluated on a commodity TI IWR6843ISK-ODS mmWave radar and a Logitech camera as the vision sensor. Our extensive evaluation reveals that the proposed method accomplishes liveness target detection with a mean average precision (mAP) of 98.1%, surpassing the performance of existing studies.
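For readers unfamiliar with the metric reported above, the following is a minimal, illustrative sketch of how average precision (AP) per class, and its mean over classes (mAP), is commonly computed for detection tasks. This is a generic sketch of the metric, not the paper's evaluation code; the inputs (confidence scores and per-detection true/false-positive flags) are hypothetical.

```python
def average_precision(scores, is_true_positive, num_ground_truth):
    """AP for one class: area under the precision-recall curve,
    with detections ranked by descending confidence score."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    ap, prev_recall = 0.0, 0.0
    for i in order:
        if is_true_positive[i]:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / num_ground_truth
        ap += precision * (recall - prev_recall)  # rectangle rule over recall
        prev_recall = recall
    return ap


def mean_average_precision(per_class_aps):
    """mAP: the mean of per-class average precisions."""
    return sum(per_class_aps) / len(per_class_aps)


# Hypothetical example: three detections, two ground-truth objects.
ap = average_precision([0.9, 0.8, 0.7], [True, False, True], num_ground_truth=2)
```

Concrete evaluation protocols (e.g. the IoU threshold used to decide whether a detection counts as a true positive, and interpolation of the precision-recall curve) vary between benchmarks, so the paper's exact 98.1% figure depends on its own matching criteria.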
About the journal:
ACM Transactions on Sensor Networks (TOSN) is a central ACM publication in the interdisciplinary area of sensor networks, spanning a broad range of disciplines from signal processing, networking and protocols, embedded systems, and information management to distributed algorithms. It covers research contributions that introduce new concepts, techniques, analyses, or architectures, as well as applied contributions that report on the development of new tools and systems or on experiences and experiments with high-impact, innovative applications. The Transactions places special emphasis on contributions taking systemic approaches to sensor networks, as well as on fundamental contributions.