Dan Zhang, Junji Yuan, Haibin Meng, Wei Wang, Rui He, Sen Li
{"title":"集成红外热像仪和 3D 激光雷达的外部校准方法","authors":"Dan Zhang, Junji Yuan, Haibin Meng, Wei Wang, Rui He, Sen Li","doi":"10.1108/sr-04-2024-0292","DOIUrl":null,"url":null,"abstract":"<h3>Purpose</h3>\n<p>In the context of fire incidents within buildings, efficient scene perception by firefighting robots is particularly crucial. Although individual sensors can provide specific types of data, achieving deep data correlation among multiple sensors poses challenges. To address this issue, this study aims to explore a fusion approach integrating thermal imaging cameras and LiDAR sensors to enhance the perception capabilities of firefighting robots in fire environments.</p><!--/ Abstract__block -->\n<h3>Design/methodology/approach</h3>\n<p>Prior to sensor fusion, accurate calibration of the sensors is essential. This paper proposes an extrinsic calibration method based on rigid body transformation. The collected data is optimized using the Ceres optimization algorithm to obtain precise calibration parameters. Building upon this calibration, a sensor fusion method based on coordinate projection transformation is proposed, enabling real-time mapping between images and point clouds. In addition, the effectiveness of the proposed fusion device data collection is validated in experimental smoke-filled fire environments.</p><!--/ Abstract__block -->\n<h3>Findings</h3>\n<p>The average reprojection error obtained by the extrinsic calibration method based on rigid body transformation is 1.02 pixels, indicating good accuracy. The fused data combines the advantages of thermal imaging cameras and LiDAR, overcoming the limitations of individual sensors.</p><!--/ Abstract__block -->\n<h3>Originality/value</h3>\n<p>This paper introduces an extrinsic calibration method based on rigid body transformation, along with a sensor fusion approach based on coordinate projection transformation. The effectiveness of this fusion strategy is validated in simulated fire environments.</p><!--/ Abstract__block -->","PeriodicalId":49540,"journal":{"name":"Sensor Review","volume":"40 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Extrinsic calibration method for integrating infrared thermal imaging camera and 3D LiDAR\",\"authors\":\"Dan Zhang, Junji Yuan, Haibin Meng, Wei Wang, Rui He, Sen Li\",\"doi\":\"10.1108/sr-04-2024-0292\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<h3>Purpose</h3>\\n<p>In the context of fire incidents within buildings, efficient scene perception by firefighting robots is particularly crucial. Although individual sensors can provide specific types of data, achieving deep data correlation among multiple sensors poses challenges. To address this issue, this study aims to explore a fusion approach integrating thermal imaging cameras and LiDAR sensors to enhance the perception capabilities of firefighting robots in fire environments.</p><!--/ Abstract__block -->\\n<h3>Design/methodology/approach</h3>\\n<p>Prior to sensor fusion, accurate calibration of the sensors is essential. This paper proposes an extrinsic calibration method based on rigid body transformation. The collected data is optimized using the Ceres optimization algorithm to obtain precise calibration parameters. Building upon this calibration, a sensor fusion method based on coordinate projection transformation is proposed, enabling real-time mapping between images and point clouds. 
In addition, the effectiveness of the proposed fusion device data collection is validated in experimental smoke-filled fire environments.</p><!--/ Abstract__block -->\\n<h3>Findings</h3>\\n<p>The average reprojection error obtained by the extrinsic calibration method based on rigid body transformation is 1.02 pixels, indicating good accuracy. The fused data combines the advantages of thermal imaging cameras and LiDAR, overcoming the limitations of individual sensors.</p><!--/ Abstract__block -->\\n<h3>Originality/value</h3>\\n<p>This paper introduces an extrinsic calibration method based on rigid body transformation, along with a sensor fusion approach based on coordinate projection transformation. The effectiveness of this fusion strategy is validated in simulated fire environments.</p><!--/ Abstract__block -->\",\"PeriodicalId\":49540,\"journal\":{\"name\":\"Sensor Review\",\"volume\":\"40 1\",\"pages\":\"\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-06-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Sensor Review\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1108/sr-04-2024-0292\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"INSTRUMENTS & INSTRUMENTATION\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sensor Review","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1108/sr-04-2024-0292","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"INSTRUMENTS & INSTRUMENTATION","Score":null,"Total":0}
Extrinsic calibration method for integrating infrared thermal imaging camera and 3D LiDAR
Purpose
In the context of fire incidents within buildings, efficient scene perception by firefighting robots is particularly crucial. Although each individual sensor provides a specific type of data, achieving deep data correlation across multiple sensors remains challenging. To address this issue, this study explores a fusion approach that integrates a thermal imaging camera and a LiDAR sensor to enhance the perception capabilities of firefighting robots in fire environments.
Design/methodology/approach
Prior to sensor fusion, accurate calibration of the sensors is essential. This paper proposes an extrinsic calibration method based on rigid body transformation. The collected calibration data are optimized with the Ceres solver to obtain precise calibration parameters. Building on this calibration, a sensor fusion method based on coordinate projection transformation is proposed, enabling real-time mapping between images and point clouds. In addition, the data collection of the proposed fusion device is validated in experimental smoke-filled fire environments.
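To make the calibration step concrete, the following is a minimal sketch of what a Ceres-based extrinsic refinement could look like. It is hypothetical code, not the authors' implementation: it assumes a pinhole model with known thermal-camera intrinsics (fx, fy, cx, cy), a set of LiDAR-point-to-thermal-pixel feature correspondences, and an angle-axis parameterization of the rigid body transform; the names ReprojectionResidual and CalibrateExtrinsics are illustrative.

```cpp
// Hypothetical sketch, not the paper's code. Assumes known thermal intrinsics
// and N LiDAR/image feature correspondences from the calibration target.
#include <array>
#include <vector>
#include <ceres/ceres.h>
#include <ceres/rotation.h>

// Residual: rotate and translate a LiDAR point into the camera frame with the
// candidate extrinsics, project it with a pinhole model, and compare to the
// observed thermal pixel.
struct ReprojectionResidual {
  ReprojectionResidual(double X, double Y, double Z, double u, double v,
                       double fx, double fy, double cx, double cy)
      : X_(X), Y_(Y), Z_(Z), u_(u), v_(v), fx_(fx), fy_(fy), cx_(cx), cy_(cy) {}

  template <typename T>
  bool operator()(const T* const rvec, const T* const t, T* residual) const {
    const T p_lidar[3] = {T(X_), T(Y_), T(Z_)};
    T p_cam[3];
    ceres::AngleAxisRotatePoint(rvec, p_lidar, p_cam);      // rigid rotation
    p_cam[0] += t[0]; p_cam[1] += t[1]; p_cam[2] += t[2];   // translation
    // Pinhole projection into the thermal image plane.
    const T u_proj = T(fx_) * p_cam[0] / p_cam[2] + T(cx_);
    const T v_proj = T(fy_) * p_cam[1] / p_cam[2] + T(cy_);
    residual[0] = u_proj - T(u_);
    residual[1] = v_proj - T(v_);
    return true;
  }

  double X_, Y_, Z_, u_, v_, fx_, fy_, cx_, cy_;
};

// Build and solve the calibration problem from correspondences.
// pts: LiDAR points {X, Y, Z}; pix: matching thermal pixels {u, v}.
// rvec (angle-axis) and t are initial guesses, refined in place.
void CalibrateExtrinsics(const std::vector<std::array<double, 3>>& pts,
                         const std::vector<std::array<double, 2>>& pix,
                         double fx, double fy, double cx, double cy,
                         double rvec[3], double t[3]) {
  ceres::Problem problem;
  for (size_t i = 0; i < pts.size(); ++i) {
    auto* cost = new ceres::AutoDiffCostFunction<ReprojectionResidual, 2, 3, 3>(
        new ReprojectionResidual(pts[i][0], pts[i][1], pts[i][2],
                                 pix[i][0], pix[i][1], fx, fy, cx, cy));
    problem.AddResidualBlock(cost, nullptr, rvec, t);
  }
  ceres::Solver::Options options;
  options.linear_solver_type = ceres::DENSE_QR;
  ceres::Solver::Summary summary;
  ceres::Solve(options, &problem, &summary);
}
```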
Findings
The average reprojection error obtained by the extrinsic calibration method based on rigid body transformation is 1.02 pixels, indicating good accuracy. The fused data combines the advantages of thermal imaging cameras and LiDAR, overcoming the limitations of individual sensors.
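The coordinate projection transformation behind both the fusion and the reported error metric can be sketched as follows. Again a hypothetical illustration under the same pinhole and known-intrinsics assumptions, using Eigen for the linear algebra; ProjectToThermal and MeanReprojectionError are made-up helper names. Projecting each LiDAR point into the thermal image lets a temperature reading be attached to every 3D point, and the same projection, averaged over the calibration correspondences, yields the mean reprojection error in pixels.

```cpp
// Hypothetical sketch, not the paper's code: point-to-pixel mapping with
// calibrated extrinsics (R, t), plus the mean reprojection error over the
// calibration correspondences.
#include <vector>
#include <Eigen/Dense>

// Project a LiDAR point into the thermal image plane.
Eigen::Vector2d ProjectToThermal(const Eigen::Vector3d& p_lidar,
                                 const Eigen::Matrix3d& R,
                                 const Eigen::Vector3d& t,
                                 double fx, double fy, double cx, double cy) {
  const Eigen::Vector3d p_cam = R * p_lidar + t;  // rigid body transform
  return {fx * p_cam.x() / p_cam.z() + cx,        // pinhole projection
          fy * p_cam.y() / p_cam.z() + cy};
}

// Average pixel distance between projected points and observed pixels.
double MeanReprojectionError(const std::vector<Eigen::Vector3d>& pts,
                             const std::vector<Eigen::Vector2d>& pix,
                             const Eigen::Matrix3d& R, const Eigen::Vector3d& t,
                             double fx, double fy, double cx, double cy) {
  double sum = 0.0;
  for (size_t i = 0; i < pts.size(); ++i)
    sum += (ProjectToThermal(pts[i], R, t, fx, fy, cx, cy) - pix[i]).norm();
  return sum / pts.size();
}
```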
Originality/value
This paper introduces an extrinsic calibration method based on rigid body transformation, along with a sensor fusion approach based on coordinate projection transformation. The effectiveness of this fusion strategy is validated in simulated fire environments.
Journal description:
Sensor Review publishes peer reviewed state-of-the-art articles and specially commissioned technology reviews. Each issue of this multidisciplinary journal includes high quality original content covering all aspects of sensors and their applications, and reflecting the most interesting and strategically important research and development activities from around the world. Because of this, readers can stay at the very forefront of high technology sensor developments.
Emphasis is placed on detailed, independent regular and review articles identifying the full range of sensors currently available for specific applications, as well as highlighting those areas of technology showing great potential for the future. The journal encourages authors to consider the practical and social implications of their articles.
All articles undergo a rigorous double-blind peer review process, which involves an initial assessment of an article's suitability for the journal followed, if deemed suitable, by review from at least two reviewers in the field.
Sensor Review’s coverage includes, but is not restricted to:
Mechanical sensors – position, displacement, proximity, velocity, acceleration, vibration, force, torque, pressure, and flow sensors
Electric and magnetic sensors – resistance, inductive, capacitive, piezoelectric, eddy-current, electromagnetic, photoelectric, and thermoelectric sensors
Temperature sensors, infrared sensors, humidity sensors
Optical, electro-optical and fibre-optic sensors and systems, photonic sensors
Biosensors, wearable and implantable sensors and systems, immunosensors
Gas and chemical sensors and systems, polymer sensors
Acoustic and ultrasonic sensors
Haptic sensors and devices
Smart and intelligent sensors and systems
Nanosensors, NEMS, MEMS, and BioMEMS
Quantum sensors
Sensor systems: sensor data fusion, signal processing and interfacing, signal conditioning.