Extrinsic calibration method for integrating infrared thermal imaging camera and 3D LiDAR

IF 1.6 · CAS Q4, Engineering · JCR Q3, Instruments & Instrumentation · Sensor Review · Pub Date: 2024-06-04 · DOI: 10.1108/sr-04-2024-0292
Dan Zhang, Junji Yuan, Haibin Meng, Wei Wang, Rui He, Sen Li
Citations: 0

Extrinsic calibration method for integrating infrared thermal imaging camera and 3D LiDAR

Purpose

In the context of fire incidents within buildings, efficient scene perception by firefighting robots is particularly crucial. Although individual sensors can provide specific types of data, achieving deep data correlation among multiple sensors poses challenges. To address this issue, this study aims to explore a fusion approach integrating thermal imaging cameras and LiDAR sensors to enhance the perception capabilities of firefighting robots in fire environments.

Design/methodology/approach

Prior to sensor fusion, accurate calibration of the sensors is essential. This paper proposes an extrinsic calibration method based on rigid body transformation. The collected data are optimized using the Ceres solver to obtain precise calibration parameters. Building upon this calibration, a sensor fusion method based on coordinate projection transformation is proposed, enabling real-time mapping between images and point clouds. In addition, the effectiveness of data collection with the proposed fusion device is validated in experimental smoke-filled fire environments.
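The abstract does not include code, but the coordinate projection transformation it describes can be sketched as follows: points in the LiDAR frame are mapped into the thermal camera frame by the calibrated rigid-body transform (R, t), then projected to pixels through a pinhole model. The intrinsic matrix K and the example values below are hypothetical, not taken from the paper.

```python
import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """Project 3D LiDAR points into thermal-image pixel coordinates.

    points_lidar: (N, 3) points in the LiDAR frame.
    R, t: rigid-body transform from the LiDAR frame to the camera frame
          (the extrinsic parameters the calibration estimates).
    K: (3, 3) thermal-camera intrinsic matrix (pinhole model, assumed).
    Returns (N, 2) pixel coordinates and a mask of points in front
    of the camera.
    """
    pts_cam = points_lidar @ R.T + t   # rigid-body transform into camera frame
    in_front = pts_cam[:, 2] > 0       # keep only points with positive depth
    uvw = pts_cam @ K.T                # perspective projection
    uv = uvw[:, :2] / uvw[:, 2:3]      # normalize by depth to get pixels
    return uv, in_front

# Hypothetical example: identity extrinsics and simple intrinsics
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.zeros(3)
pts = np.array([[0.0, 0.0, 2.0]])      # one point 2 m straight ahead
uv, mask = project_lidar_to_image(pts, R, t, K)
# A point on the optical axis lands at the principal point (320, 240)
```

In a real-time pipeline this mapping runs per LiDAR scan, so each point cloud can be colored with the temperature of the pixel it projects onto.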

Findings

The average reprojection error obtained by the extrinsic calibration method based on rigid body transformation is 1.02 pixels, indicating good accuracy. The fused data combines the advantages of thermal imaging cameras and LiDAR, overcoming the limitations of individual sensors.
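The 1.02-pixel figure is an average reprojection error. As a sketch of how such a metric is typically computed (not the authors' code), one projects the 3D calibration-target points with the estimated extrinsics and averages the pixel distances to the corresponding detected image features; the coordinates below are made up for illustration.

```python
import numpy as np

def mean_reprojection_error(projected_uv, detected_uv):
    """Average Euclidean pixel distance between projected 3D target
    points and their detected 2D locations in the thermal image."""
    diffs = projected_uv - detected_uv
    return float(np.mean(np.linalg.norm(diffs, axis=1)))

# Hypothetical detections, each offset by one pixel from the projections
proj = np.array([[100.0, 100.0], [200.0, 150.0]])
det = np.array([[101.0, 100.0], [200.0, 151.0]])
err = mean_reprojection_error(proj, det)
# err == 1.0 here
```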

Originality/value

This paper introduces an extrinsic calibration method based on rigid body transformation, along with a sensor fusion approach based on coordinate projection transformation. The effectiveness of this fusion strategy is validated in simulated fire environments.

Source journal: Sensor Review (Engineering – Instruments & Instrumentation)
CiteScore: 3.40 · Self-citation rate: 6.20% · Articles per year: 50 · Review time: 3.7 months
About the journal: Sensor Review publishes peer reviewed state-of-the-art articles and specially commissioned technology reviews. Each issue of this multidisciplinary journal includes high quality original content covering all aspects of sensors and their applications, and reflecting the most interesting and strategically important research and development activities from around the world. Because of this, readers can stay at the very forefront of high technology sensor developments.

Emphasis is placed on detailed independent regular and review articles identifying the full range of sensors currently available for specific applications, as well as highlighting those areas of technology showing great potential for the future. The journal encourages authors to consider the practical and social implications of their articles. All articles undergo a rigorous double-blind peer review process which involves an initial assessment of suitability of an article for the journal followed by sending it to at least two reviewers in the field if deemed suitable.

Sensor Review's coverage includes, but is not restricted to:
- Mechanical sensors – position, displacement, proximity, velocity, acceleration, vibration, force, torque, pressure, and flow sensors
- Electric and magnetic sensors – resistance, inductive, capacitive, piezoelectric, eddy-current, electromagnetic, photoelectric, and thermoelectric sensors
- Temperature sensors, infrared sensors, humidity sensors
- Optical, electro-optical and fibre-optic sensors and systems, photonic sensors
- Biosensors, wearable and implantable sensors and systems, immunosensors
- Gas and chemical sensors and systems, polymer sensors
- Acoustic and ultrasonic sensors
- Haptic sensors and devices
- Smart and intelligent sensors and systems
- Nanosensors, NEMS, MEMS, and BioMEMS
- Quantum sensors
- Sensor systems: sensor data fusion, signals, processing and interfacing, signal conditioning
Latest articles in this journal:
- Multi-sensor integration on one microfluidics chip for single-stranded DNA detection
- Advances in drift compensation algorithms for electronic nose technology
- A novel Au-NPs/DBTTA nanocomposite-based electrochemical sensor for the detection of ascorbic acid (AA)
- A step length estimation method based on frequency domain feature analysis and gait recognition for pedestrian dead reckoning
- Liquid viscosity measurement based on disk-shaped electromechanical resonator