Tracking and Estimation Approach for Human-Aware Mobile Robot Navigation
Winston Doss Marveldoss; Bandaru Joshika; Bijo Sebastian
IEEE Sensors Letters, vol. 8, no. 12, pp. 1-4, published 2024-11-06
DOI: 10.1109/LSENS.2024.3492373 (https://ieeexplore.ieee.org/document/10745641/)
Citations: 0
Abstract
Accurate perception of the environment, including the detection and tracking of humans, is essential for safe navigation of mobile robots in human-centric environments. Existing state-of-the-art techniques rely on high-performance sensors. This leads to expensive robotic systems, which limits the large-scale deployment of autonomous mobile robots in social spaces. In this letter, we propose and validate a novel human tracking and estimation approach that relies on a low-cost 2-D LiDAR and a monocular camera. The proposed approach leverages the capabilities of each sensor by relying on the camera for human detection and the LiDAR for human pose estimation. Precise calibration and registration of the sensor frames allow for data association in the presence of multiple human targets. Human detection and pose estimation data from the sensor suite are used as measurements by an extended Kalman filter, which allows for effective tracking over multiple frames, even in the presence of occlusion. The overall approach addresses the limitations of each individual sensor without increasing the overall cost of the sensor suite. Tracking and estimation performance for the proposed approach was evaluated in experimental trials under real-world conditions, with artificial markers serving as ground truth for each human target. The results demonstrate that the proposed approach performs well enough to support human-aware autonomous navigation in real-world settings.
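The abstract describes fusing LiDAR-derived pose measurements through an extended Kalman filter to track each human across frames. The letter does not give its filter equations, so the following is only a minimal illustrative sketch, not the authors' implementation: a constant-velocity state model (an assumption here) with a nonlinear range/bearing measurement, as a 2-D LiDAR might report for a detected person. All parameter values (noise scales `q`, `r`, the time step) are placeholders.

```python
import numpy as np

def ekf_predict(x, P, dt, q=0.1):
    """Predict step under an assumed constant-velocity model.

    State x = [px, py, vx, vy]: planar position and velocity of one
    tracked human in the robot frame.
    """
    F = np.array([[1.0, 0.0, dt, 0.0],
                  [0.0, 1.0, 0.0, dt],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    Q = q * np.eye(4)            # simplified isotropic process noise
    return F @ x, F @ P @ F.T + Q

def ekf_update(x, P, z, r=0.05):
    """Update with a range/bearing measurement z = [rho, theta]
    from a LiDAR assumed to sit at the origin of the robot frame."""
    px, py = x[0], x[1]
    rho = np.hypot(px, py)
    h = np.array([rho, np.arctan2(py, px)])        # predicted measurement
    # Jacobian of h with respect to the state (velocity terms are zero)
    H = np.array([[ px / rho,     py / rho,     0.0, 0.0],
                  [-py / rho**2,  px / rho**2,  0.0, 0.0]])
    R = r * np.eye(2)
    y = z - h
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi    # wrap bearing residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(4) - K @ H) @ P
    return x_new, P_new

# Example: track a stationary person at (2, 1) m from a rough initial guess.
x = np.array([1.5, 1.5, 0.0, 0.0])
P = np.eye(4)
z_true = np.array([np.hypot(2.0, 1.0), np.arctan2(1.0, 2.0)])
for _ in range(20):
    x, P = ekf_predict(x, P, dt=0.1)
    x, P = ekf_update(x, P, z_true)
```

Because range and bearing together fully determine planar position, the position estimate converges to the target under repeated noiseless measurements; the velocity states let the filter coast through the short occlusions the abstract mentions by running predict-only steps.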