{"title":"港口场景下自动驾驶汽车的环境感知和目标跟踪","authors":"Jiaying Lin, Lucas Koch, M. Kurowski, Jan-Jöran Gehrt, D. Abel, R. Zweigel","doi":"10.1109/ITSC45102.2020.9294618","DOIUrl":null,"url":null,"abstract":"Environmental perception is one of the critical aspects of autonomous driving for maritime applications, especially in fields of self-navigation and maneuver planning. For near-field recognition, this paper proposes a novel framework for data fusion, which can determine the occupied static space and track dynamic objects simultaneously. An unmanned surface vessel (USV) is equipped with LiDAR sensors, a GNSS receiver, and an Inertial Navigation System (INS). In the framework, the point cloud from LiDAR sensors is firstly clustered into various objects, then associated with known objects. After dynamic segmentation, the static objects are represented using an optimized occupancy grid map, and the dynamic objects are tracked and matched to the corresponding Automatic Identification System (AIS) messages. The proposed algorithms are validated with data collected from real-world tests, which are conducted in Rostock Harbor, Germany. After applying the proposed algorithm, the perceived test area can be represented with a 3D occupancy grid map with a 10 cm resolution. At the same time, dynamic objects in the view are detected and tracked successfully with an error of less than 10%. 
The plausibility of the results is qualitatively evaluated by comparing with Google Maps© and the corresponding AIS messages.","PeriodicalId":394538,"journal":{"name":"2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Environment Perception and Object Tracking for Autonomous Vehicles in a Harbor Scenario\",\"authors\":\"Jiaying Lin, Lucas Koch, M. Kurowski, Jan-Jöran Gehrt, D. Abel, R. Zweigel\",\"doi\":\"10.1109/ITSC45102.2020.9294618\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Environmental perception is one of the critical aspects of autonomous driving for maritime applications, especially in fields of self-navigation and maneuver planning. For near-field recognition, this paper proposes a novel framework for data fusion, which can determine the occupied static space and track dynamic objects simultaneously. An unmanned surface vessel (USV) is equipped with LiDAR sensors, a GNSS receiver, and an Inertial Navigation System (INS). In the framework, the point cloud from LiDAR sensors is firstly clustered into various objects, then associated with known objects. After dynamic segmentation, the static objects are represented using an optimized occupancy grid map, and the dynamic objects are tracked and matched to the corresponding Automatic Identification System (AIS) messages. The proposed algorithms are validated with data collected from real-world tests, which are conducted in Rostock Harbor, Germany. After applying the proposed algorithm, the perceived test area can be represented with a 3D occupancy grid map with a 10 cm resolution. At the same time, dynamic objects in the view are detected and tracked successfully with an error of less than 10%. 
The plausibility of the results is qualitatively evaluated by comparing with Google Maps© and the corresponding AIS messages.\",\"PeriodicalId\":394538,\"journal\":{\"name\":\"2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC)\",\"volume\":\"5 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-09-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ITSC45102.2020.9294618\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ITSC45102.2020.9294618","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9
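The step of matching tracked dynamic objects to AIS messages can be sketched as a gated nearest-neighbour association. The data layout, gate size, and positions below are illustrative assumptions, not the paper's actual association method.

```python
import math

def associate_tracks_to_ais(tracks, ais_reports, gate_m=50.0):
    """Greedy gated nearest-neighbour matching of tracks to AIS positions.

    tracks: dict track_id -> (x, y) position in metres (local frame).
    ais_reports: dict MMSI -> (x, y) position in the same frame.
    Returns dict track_id -> MMSI for pairs closer than gate_m.
    """
    matches = {}
    used = set()
    for tid, (tx, ty) in tracks.items():
        best, best_d = None, gate_m
        for mmsi, (ax, ay) in ais_reports.items():
            if mmsi in used:
                continue
            d = math.hypot(tx - ax, ty - ay)
            if d < best_d:
                best, best_d = mmsi, d
        if best is not None:          # only accept matches inside the gate
            matches[tid] = best
            used.add(best)
    return matches

# Hypothetical tracked objects and AIS reports:
tracks = {1: (100.0, 200.0), 2: (400.0, 50.0)}
ais = {211234560: (105.0, 198.0), 211234561: (900.0, 900.0)}
print(associate_tracks_to_ais(tracks, ais))  # -> {1: 211234560}
```

Track 2 stays unmatched because no AIS report lies within its 50 m gate — reasonable for small craft, which are not required to carry AIS transponders.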