Extrinsic calibration of a fisheye multi-camera setup using overlapping fields of view
Moritz Knorr, José Esparza, W. Niehsen, C. Stiller
2014 IEEE Intelligent Vehicles Symposium Proceedings · June 8, 2014 · DOI: 10.1109/IVS.2014.6856403
It is well known that the robustness of many computer vision algorithms can be improved by employing large field-of-view cameras, such as omnidirectional cameras. To avoid obstructions in the field of view, such cameras need to be mounted in an exposed position. Alternatively, a multi-camera setup can be used; however, this requires the extrinsic calibration to be known. In the present work, we propose a method to calibrate a fisheye multi-camera rig mounted on a mobile platform. The method relies only on feature correspondences from the pairwise overlapping fields of view of adjacent cameras. In contrast to existing approaches, neither motion estimation nor specific motion patterns are required. To compensate for the large spatial extent of multi-camera setups and the corresponding viewpoint variations, as well as the geometric distortions caused by fisheye lenses, captured images are mapped into virtual camera views such that corresponding image regions coincide. To this end, the scene geometry is approximated by the ground plane in close proximity and by infinitely distant objects elsewhere. As a result, low-complexity feature detectors and matchers can be employed. The approach is evaluated using a setup of four rigidly coupled and synchronized wide-angle fisheye cameras attached to the four sides of a mobile platform. The cameras have pairwise overlapping fields of view and baselines between 2.25 and 3 meters.
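The core scene approximation described in the abstract — nearby rays are intersected with the ground plane, all other rays are treated as directions to infinitely distant objects — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the equidistant fisheye model, the function names, and the pose conventions (`R_wc`, `t_wc` mapping camera coordinates to a world frame with the ground at z = 0) are all assumptions made for the sketch.

```python
import numpy as np

def unproject_equidistant(u, v, f, cx, cy):
    """Unproject a fisheye pixel to a unit ray in camera coordinates.

    Assumes the equidistant projection model r = f * theta, a common
    (but here merely assumed) fisheye model.
    """
    x, y = u - cx, v - cy
    r = np.hypot(x, y)
    theta = r / f                      # angle from the optical axis
    phi = np.arctan2(y, x)             # azimuth in the image plane
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def ray_to_world_point(ray_cam, R_wc, t_wc, ground_z=0.0):
    """Map a camera ray onto the approximated scene geometry.

    Rays pointing toward the ground are intersected with the plane
    z = ground_z; all other rays are treated as directions to points
    at infinity. Returns (point_or_direction, at_infinity).
    """
    d = R_wc @ ray_cam                 # ray direction in the world frame
    o = t_wc                           # camera center in the world frame
    if d[2] < -1e-6:                   # ray points downward: hits the ground
        s = (ground_z - o[2]) / d[2]   # distance along the ray to the plane
        return o + s * d, False        # finite ground-plane point
    return d, True                     # direction only (point at infinity)
```

With both cameras' rays mapped onto this common geometry, the scene points can be reprojected into a shared virtual view, so that corresponding image regions coincide and simple feature detectors and matchers suffice, as the abstract describes.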