HD Lane Map Generation Based on Trail Map Aggregation

P. Colling, Dennis Müller, M. Rottmann
DOI: 10.1109/iv51971.2022.9827144
Published in: 2022 IEEE Intelligent Vehicles Symposium (IV), 2022-06-05
Citations: 2

Abstract

We present a procedure to create high-definition maps of lanes from detected and tracked vehicles in perception sensor data, as well as from the ego vehicle, using multiple observations of the same location. The procedure consists of two parts. First, an aggregation part in which the detected and tracked road users as well as the driving path of the ego vehicle are aggregated into a map representation. Second, lanes are extracted from those maps as lane center lines in a structured data format. The final lane centers are represented as a directed graph that includes the driving direction, and they are accurate to within a few centimeters. Our procedure is not restricted to any environment and does not rely on any prior map information. In our experiments with real-world data and available ground truth, we study the performance of different map aggregations, e.g., based on the ego vehicle only or based on other road users. Furthermore, we study the dependence on the number of data recording repetitions.
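The two-part procedure described above can be illustrated with a minimal sketch: driving-path observations are accumulated into a grid-based map, and frequently observed cells are then linked along the driving direction into a directed graph of lane-center cells. This is not the authors' implementation; the cell size, observation threshold, and the assumption that driving proceeds in the +x direction are all illustrative choices.

```python
from collections import defaultdict

def aggregate_trails(trails, cell_size=0.5):
    """Part 1 (sketch): accumulate observed driving-path points into a
    2D grid histogram. `trails` is a list of trajectories, each a list
    of (x, y) positions in meters. Returns {grid_cell: observation_count}."""
    grid = defaultdict(int)
    for trail in trails:
        for x, y in trail:
            cell = (int(x // cell_size), int(y // cell_size))
            grid[cell] += 1
    return grid

def extract_lane_graph(grid, min_count=2):
    """Part 2 (sketch): keep cells observed at least `min_count` times and
    connect each kept cell to kept neighbors one step ahead, yielding a
    directed graph whose edges follow the assumed +x driving direction."""
    kept = {c for c, n in grid.items() if n >= min_count}
    edges = {c: [] for c in kept}
    for (cx, cy) in kept:
        for dy in (-1, 0, 1):
            nxt = (cx + 1, cy + dy)
            if nxt in kept:
                edges[(cx, cy)].append(nxt)
    return edges

# Three recording repetitions of roughly the same straight drive:
trails = [[(i * 0.5, 0.1 * r) for i in range(10)] for r in range(3)]
grid = aggregate_trails(trails)
graph = extract_lane_graph(grid)
```

Increasing the number of repetitions in `trails` raises the per-cell counts, which mirrors the paper's study of how map quality depends on the number of data recording repetitions.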