Improved auto-extrinsic calibration between stereo vision camera and laser range finder
Archana Khurana, K. S. Nagla
International Journal of Image and Data Fusion, published 2020-02-28
DOI: 10.1080/19479832.2020.1727574
ABSTRACT This study presents a way to accurately estimate the extrinsic calibration parameters between a stereo vision camera and a 2D laser range finder (LRF), based on 3D reconstruction of a monochromatic calibration board and geometric co-planarity constraints between the views from the two sensors. It supports automatic extraction of plane-line correspondences between the camera and the LRF using the monochromatic board, further improved by selecting optimal threshold values for laser scan dissection to extract line features from the LRF data. The calibration parameters are then obtained by solving the co-planarity constraints between the estimated plane and line. The obtained parameters are subsequently refined by minimising the reprojection error and the error from the co-planarity constraints. Moreover, improved calibration accuracy is achieved by extracting reliable plane-line correspondences with the monochromatic board, which reduces the impact of the range-reflectivity bias observed in LRF data on a checkerboard. As the proposed method automatically extracts feature correspondences, it greatly reduces the operator time required in comparison to manual methods. The performance is validated by extensive experimentation and simulation, and the parameters estimated by the proposed method demonstrate better accuracy than those of conventional methods.
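To make the co-planarity constraint concrete, the following is a minimal sketch (not the paper's exact pipeline) of the classical linear step for camera-LRF extrinsic calibration. A 2D LRF point p = (x, y) in its scan plane, written homogeneously as p̂ = (x, y, 1), must lie on the board plane estimated in the camera frame with unit normal n and offset d; with H = [r1 r2 t] holding the first two rotation columns and the translation, this gives the linear constraint nᵀH p̂ = d. All function and variable names below are illustrative; the paper additionally refines such an estimate nonlinearly.

```python
import numpy as np

def estimate_extrinsics(planes, laser_points):
    """Linear estimate of the LRF -> camera transform (R, t).

    planes: list of (n, d) with unit normal n (3,) and offset d such
            that n . X = d for camera-frame points X on the board.
    laser_points: list of (M_i, 2) arrays of in-plane LRF points that
            hit the board in each pose.
    Each point yields one co-planarity equation n^T H p_hat = d,
    linear in the 9 entries of H = [r1 r2 t].
    """
    rows, rhs = [], []
    for (n, d), pts in zip(planes, laser_points):
        for (x, y) in pts:
            p_hat = np.array([x, y, 1.0])
            # (p_hat kron n) . vec_colmajor(H) == n^T H p_hat
            rows.append(np.kron(p_hat, n))
            rhs.append(d)
    h, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    H = h.reshape(3, 3, order="F")          # column-major vec -> matrix
    r1, r2, t = H[:, 0].copy(), H[:, 1].copy(), H[:, 2].copy()
    # Re-orthonormalise (r1, r2) and complete the rotation with r1 x r2.
    r1 /= np.linalg.norm(r1)
    r2 -= (r1 @ r2) * r1
    r2 /= np.linalg.norm(r2)
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    return R, t
```

Note that with a single 2D scan plane the third rotation column never appears in the data, which is why it is recovered as r1 × r2; several board poses with non-parallel normals are needed for the stacked system to reach full rank.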
Journal description:
International Journal of Image and Data Fusion provides a single source of information for all aspects of image and data fusion methodologies, developments, techniques and applications. Image and data fusion techniques are important for combining the many sources of satellite, airborne and ground based imaging systems, and integrating these with other related data sets for enhanced information extraction and decision making. Image and data fusion aims at the integration of multi-sensor, multi-temporal, multi-resolution and multi-platform image data, together with geospatial data, GIS, in-situ, and other statistical data sets, for improved information extraction and increased reliability of the information. This leads to more accurate information that provides for robust operational performance, i.e. increased confidence, reduced ambiguity and improved classification enabling evidence based management.

The journal welcomes original research papers, review papers, shorter letters, technical articles, book reviews and conference reports in all areas of image and data fusion including, but not limited to, the following aspects and topics:

• Automatic registration/geometric aspects of fusing images with different spatial, spectral, temporal resolutions; phase information; or acquired in different modes
• Pixel, feature and decision level fusion algorithms and methodologies
• Data assimilation: fusing data with models
• Multi-source classification and information extraction
• Integration of satellite, airborne and terrestrial sensor systems
• Fusing temporal data sets for change detection studies (e.g. for Land Cover/Land Use Change studies)
• Image and data mining from multi-platform, multi-source, multi-scale, multi-temporal data sets (e.g. geometric information, topological information, statistical information, etc.)