Multi-data sensor fusion framework to detect transparent object for the efficient mobile robot mapping

Ravinder Singh, K. S. Nagla
{"title":"多数据传感器融合框架检测透明目标,为移动机器人的高效测绘提供依据","authors":"Ravinder Singh, K. S. Nagla","doi":"10.1108/IJIUS-05-2018-0013","DOIUrl":null,"url":null,"abstract":"\nPurpose\nAn efficient perception of the complex environment is the foremost requirement in mobile robotics. At present, the utilization of glass as a glass wall and automated transparent door in the modern building has become a highlight feature for interior decoration, which has resulted in the wrong perception of the environment by various range sensors. The perception generated by multi-data sensor fusion (MDSF) of sonar and laser is fairly consistent to detect glass but is still affected by the issues such as sensor inaccuracies, sensor reliability, scan mismatching due to glass, sensor model, probabilistic approaches for sensor fusion, sensor registration, etc. The paper aims to discuss these issues.\n\n\nDesign/methodology/approach\nThis paper presents a modified framework – Advanced Laser and Sonar Framework (ALSF) – to fuse the sensory information of a laser scanner and sonar to reduce the uncertainty caused by glass in an environment by selecting the optimal range information corresponding to a selected threshold value. In the proposed approach, the conventional sonar sensor model is also modified to reduce the wrong perception in sonar as an outcome of the diverse range measurement. The laser scan matching algorithm is also modified by taking out the small cluster of laser point (w.r.t. range information) to get efficient perception.\n\n\nFindings\nThe probability of the occupied cells w.r.t. the modified sonar sensor model becomes consistent corresponding to diverse sonar range measurement. The scan matching technique is also modified to reduce the uncertainty caused by glass and high computational load for the efficient and fast pose estimation of the laser sensor/mobile robot to generate robust mapping. These stated modifications are linked with the proposed ALSF technique to reduce the uncertainty caused by glass, inconsistent probabilities and high load computation during the generation of occupancy grid mapping with MDSF. Various real-world experiments are performed with the implementation of the proposed approach on a mobile robot fitted with laser and sonar, and the obtained results are qualitatively and quantitatively compared with conventional approaches.\n\n\nOriginality/value\nThe proposed ASIF approach generates efficient perception of the complex environment contains glass and can be implemented for various robotics applications.\n","PeriodicalId":42876,"journal":{"name":"International Journal of Intelligent Unmanned Systems","volume":" ","pages":""},"PeriodicalIF":0.8000,"publicationDate":"2019-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1108/IJIUS-05-2018-0013","citationCount":"14","resultStr":"{\"title\":\"Multi-data sensor fusion framework to detect transparent object for the efficient mobile robot mapping\",\"authors\":\"Ravinder Singh, K. S. Nagla\",\"doi\":\"10.1108/IJIUS-05-2018-0013\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"\\nPurpose\\nAn efficient perception of the complex environment is the foremost requirement in mobile robotics. At present, the utilization of glass as a glass wall and automated transparent door in the modern building has become a highlight feature for interior decoration, which has resulted in the wrong perception of the environment by various range sensors. 
The perception generated by multi-data sensor fusion (MDSF) of sonar and laser is fairly consistent to detect glass but is still affected by the issues such as sensor inaccuracies, sensor reliability, scan mismatching due to glass, sensor model, probabilistic approaches for sensor fusion, sensor registration, etc. The paper aims to discuss these issues.\\n\\n\\nDesign/methodology/approach\\nThis paper presents a modified framework – Advanced Laser and Sonar Framework (ALSF) – to fuse the sensory information of a laser scanner and sonar to reduce the uncertainty caused by glass in an environment by selecting the optimal range information corresponding to a selected threshold value. In the proposed approach, the conventional sonar sensor model is also modified to reduce the wrong perception in sonar as an outcome of the diverse range measurement. The laser scan matching algorithm is also modified by taking out the small cluster of laser point (w.r.t. range information) to get efficient perception.\\n\\n\\nFindings\\nThe probability of the occupied cells w.r.t. the modified sonar sensor model becomes consistent corresponding to diverse sonar range measurement. The scan matching technique is also modified to reduce the uncertainty caused by glass and high computational load for the efficient and fast pose estimation of the laser sensor/mobile robot to generate robust mapping. These stated modifications are linked with the proposed ALSF technique to reduce the uncertainty caused by glass, inconsistent probabilities and high load computation during the generation of occupancy grid mapping with MDSF. Various real-world experiments are performed with the implementation of the proposed approach on a mobile robot fitted with laser and sonar, and the obtained results are qualitatively and quantitatively compared with conventional approaches.\\n\\n\\nOriginality/value\\nThe proposed ASIF approach generates efficient perception of the complex environment contains glass and can be implemented for various robotics applications.\\n\",\"PeriodicalId\":42876,\"journal\":{\"name\":\"International Journal of Intelligent Unmanned Systems\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.8000,\"publicationDate\":\"2019-01-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1108/IJIUS-05-2018-0013\",\"citationCount\":\"14\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Intelligent Unmanned Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1108/IJIUS-05-2018-0013\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ROBOTICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Intelligent Unmanned Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1108/IJIUS-05-2018-0013","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ROBOTICS","Score":null,"Total":0}
引用次数: 14

Abstract

Purpose
Efficient perception of complex environments is a foremost requirement in mobile robotics. Glass walls and automated transparent doors have become a highlight of interior design in modern buildings, and such surfaces cause various range sensors to perceive the environment incorrectly. The perception generated by multi-data sensor fusion (MDSF) of sonar and laser is fairly consistent in detecting glass, but it is still affected by issues such as sensor inaccuracy, sensor reliability, scan mismatching caused by glass, the sensor model, probabilistic approaches to sensor fusion and sensor registration. This paper aims to address these issues.

Design/methodology/approach
The paper presents a modified framework, the Advanced Laser and Sonar Framework (ALSF), which fuses the sensory information of a laser scanner and sonar to reduce the uncertainty caused by glass in an environment by selecting the optimal range information corresponding to a selected threshold value. The conventional sonar sensor model is also modified to reduce incorrect sonar perception arising from diverse range measurements, and the laser scan matching algorithm is modified by removing small clusters of laser points (with respect to range information) to obtain efficient perception. Illustrative sketches of these steps are given below.

Findings
With the modified sonar sensor model, the occupancy probability of cells remains consistent across diverse sonar range measurements. The scan matching technique is also modified to reduce the uncertainty caused by glass and the computational load, enabling efficient and fast pose estimation of the laser sensor/mobile robot and robust mapping. These modifications are combined in the proposed ALSF technique to reduce the uncertainty caused by glass, inconsistent probabilities and heavy computation during the generation of occupancy grid maps with MDSF. Real-world experiments are performed by implementing the proposed approach on a mobile robot fitted with a laser scanner and sonar, and the results are compared qualitatively and quantitatively with conventional approaches.

Originality/value
The proposed ALSF approach generates efficient perception of complex environments containing glass and can be employed in various robotics applications.
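To make the threshold-based selection concrete, here is a minimal sketch of how laser and sonar ranges could be fused beam by beam. It assumes the two scans are co-registered; the function names and the GLASS_DISPARITY_THRESHOLD value are illustrative and are not taken from the paper.

```python
# Hypothetical sketch of threshold-based range selection in the spirit of ALSF.
import numpy as np

GLASS_DISPARITY_THRESHOLD = 0.5  # metres; assumed tuning value, not from the paper

def select_range(laser_range: float, sonar_range: float) -> float:
    """Pick the range reading more likely to reflect the true surface.

    A laser beam tends to pass through glass and report an overly long range,
    while a sonar echo returns from the glass pane itself. When the two sensors
    disagree by more than the threshold, the shorter sonar reading is trusted;
    otherwise the more precise laser reading is kept.
    """
    if laser_range - sonar_range > GLASS_DISPARITY_THRESHOLD:
        return sonar_range   # laser likely saw "through" the glass
    return laser_range       # readings agree: keep the higher-resolution laser

def fuse_scans(laser_scan: np.ndarray, sonar_scan: np.ndarray) -> np.ndarray:
    """Fuse two co-registered range scans beam by beam."""
    return np.array([select_range(l, s) for l, s in zip(laser_scan, sonar_scan)])
```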
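The removal of small clusters of laser points before scan matching might look like the following sketch. The clustering criterion (a range jump between neighbouring beams) and both parameter values are assumptions made for illustration only.

```python
# Illustrative filter for the "small cluster of laser points" removal step.
import numpy as np

def remove_small_clusters(ranges: np.ndarray,
                          jump_thresh: float = 0.3,
                          min_cluster_size: int = 5) -> np.ndarray:
    """Mark isolated groups of laser returns as invalid (NaN).

    Consecutive beams whose ranges differ by less than `jump_thresh` are
    grouped into one cluster; clusters with fewer than `min_cluster_size`
    points are treated as spurious (e.g. sparse reflections from glass)
    and dropped before scan matching.
    """
    filtered = ranges.astype(float)
    cluster_start = 0
    for i in range(1, len(ranges) + 1):
        end_of_cluster = (i == len(ranges) or
                          abs(ranges[i] - ranges[i - 1]) > jump_thresh)
        if end_of_cluster:
            if i - cluster_start < min_cluster_size:
                filtered[cluster_start:i] = np.nan  # discard small cluster
            cluster_start = i
    return filtered
```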
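Finally, the occupancy grid update mentioned in the findings can be pictured as a standard log-odds binary Bayes filter per cell; this generic sketch is shown only for context and does not reproduce the paper's modified sonar sensor model.

```python
# Minimal log-odds occupancy-grid cell update (generic, not the paper's model).
import numpy as np

L_OCC, L_FREE = 0.85, -0.4   # assumed log-odds increments for hit / free readings

def update_cell(logodds: float, hit: bool) -> float:
    """Binary Bayes filter update for a single grid cell."""
    return logodds + (L_OCC if hit else L_FREE)

def occupancy_probability(logodds: float) -> float:
    """Convert accumulated log-odds back to an occupancy probability in [0, 1]."""
    return 1.0 - 1.0 / (1.0 + np.exp(logodds))
```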