Tracking and Estimation Approach for Human-Aware Mobile Robot Navigation

IEEE Sensors Letters · IF 2.2 · Q3 (Engineering, Electrical & Electronic) · Publication date: 2024-11-06 · DOI: 10.1109/LSENS.2024.3492373
Winston Doss Marveldoss;Bandaru Joshika;Bijo Sebastian
{"title":"Tracking and Estimation Approach for Human-Aware Mobile Robot Navigation","authors":"Winston Doss Marveldoss;Bandaru Joshika;Bijo Sebastian","doi":"10.1109/LSENS.2024.3492373","DOIUrl":null,"url":null,"abstract":"Accurate perception of the environment, including the detection and tracking of humans, is essential for safe navigation of mobile robots in human-centric environments. Existing State-of-the-Art techniques rely on high-performance sensors. This leads to expensive robotic systems, which limits the large-scale deployment of autonomous mobile robots in social spaces. In this letter, we propose and validate a novel human tracking and estimation approach that relies on a low-cost 2-D LiDAR and a monocular camera. The proposed approach leverages the capabilities of each sensor by relying on the camera for human detection and the LiDAR for human pose estimation. Precise calibration and registration of the sensor frames allow for data association in the presence of multiple human targets. Human detection and pose estimation data from the sensor suite are used as measurement by an extended Kalman filter, which allows for effective tracking over multiple frames, even in the presence of occlusion. The overall approach addresses the limitations of each individual sensor without increasing the overall cost of the sensor suite. Tracking and estimation performance for the proposed approach was evaluated on experimental trails in real-world conditions with artificial markers as ground truth for each human target. The results demonstrate satisfactory performance for the proposed approach to be used in human-aware autonomous navigation in real-world settings.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 12","pages":"1-4"},"PeriodicalIF":2.2000,"publicationDate":"2024-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Letters","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10745641/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Accurate perception of the environment, including the detection and tracking of humans, is essential for the safe navigation of mobile robots in human-centric environments. Existing state-of-the-art techniques rely on high-performance sensors, which leads to expensive robotic systems and limits the large-scale deployment of autonomous mobile robots in social spaces. In this letter, we propose and validate a novel human tracking and estimation approach that relies on a low-cost 2-D LiDAR and a monocular camera. The proposed approach leverages the strengths of each sensor, relying on the camera for human detection and on the LiDAR for human pose estimation. Precise calibration and registration of the sensor frames allow for data association in the presence of multiple human targets. Human detection and pose estimation data from the sensor suite are used as measurements by an extended Kalman filter, which allows for effective tracking over multiple frames, even in the presence of occlusion. The overall approach addresses the limitations of each individual sensor without increasing the overall cost of the sensor suite. Tracking and estimation performance of the proposed approach was evaluated in experimental trials under real-world conditions, with artificial markers providing ground truth for each human target. The results demonstrate performance that is satisfactory for use in human-aware autonomous navigation in real-world settings.
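The abstract outlines the pipeline only at a high level (camera-based detection, LiDAR-based position measurement, calibrated data association, and an extended Kalman filter for multi-frame tracking). The letter's state vector, motion model, and noise parameters are not given here, so the following is a minimal sketch of the tracking step under assumed choices: a constant-velocity state [x, y, vx, vy], a 2-D position measurement taken from the LiDAR cluster associated with a camera detection, and illustrative noise values. With these linear models the filter reduces to a standard Kalman filter; the authors' EKF presumably linearizes a nonlinear measurement or motion model not detailed in the abstract.

# Minimal sketch of the tracking idea described in the abstract: a Kalman-style
# filter that fuses camera-gated human detections with 2-D LiDAR position
# measurements. State vector, models, and all numeric values are assumptions
# made for illustration, not the letter's actual implementation.
import numpy as np

class HumanTrackEKF:
    def __init__(self, x0, y0, dt=0.1):
        # State: [x, y, vx, vy] in the robot frame (assumed).
        self.x = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4)                         # initial covariance (assumed)
        self.dt = dt
        self.Q = np.diag([0.01, 0.01, 0.1, 0.1])   # process noise (assumed)
        self.R = np.diag([0.05, 0.05])             # LiDAR measurement noise (assumed)

    def predict(self):
        # Constant-velocity prediction, run every frame. Running it through
        # frames with no measurement is what lets a track survive occlusion.
        F = np.array([[1, 0, self.dt, 0],
                      [0, 1, 0, self.dt],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]], dtype=float)
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        # z: 2-D position of the LiDAR cluster associated with this track,
        # used only when the camera detector has confirmed a human there.
        H = np.array([[1, 0, 0, 0],
                      [0, 1, 0, 0]], dtype=float)
        y = np.asarray(z, dtype=float) - H @ self.x   # innovation
        S = H @ self.P @ H.T + self.R                 # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P

In use, predict() would run every control cycle and update() only when a camera detection and an associated LiDAR cluster are available; during occlusion the track coasts on the prediction alone, which matches the multi-frame tracking behavior the abstract describes.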
Source journal
IEEE Sensors Letters (Engineering - Electrical and Electronic Engineering)
CiteScore: 3.50
Self-citation rate: 7.10%
Articles published: 194