Visual Marker Guided Point Cloud Registration in a Large Multi-Sensor Industrial Robot Cell

Erind Ujkani, J. Dybedal, Atle Aalerud, Knut B. Kaldestad, G. Hovland
{"title":"大型多传感器工业机器人单元的视觉标记引导点云配准","authors":"Erind Ujkani, J. Dybedal, Atle Aalerud, Knut B. Kaldestad, G. Hovland","doi":"10.1109/MESA.2018.8449195","DOIUrl":null,"url":null,"abstract":"This paper presents a benchmark and accuracy analysis of 3D sensor calibration in a large industrial robot cell. The sensors used were the Kinect v2 which contains both an RGB and an IR camera measuring depth based on the time-of-flight principle. The approach taken was based on a novel procedure combining Aruco visual markers, methods using region of interest and iterative closest point. The calibration of sensors is performed pairwise, exploiting the fact that time-of-flight sensors can have some overlap in the generated point cloud data. For a volume measuring 10m × 14m × 5m a typical accuracy of the generated point cloud data of 5–10cm was achieved using six sensor nodes.","PeriodicalId":138936,"journal":{"name":"2018 14th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA)","volume":"41 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Visual Marker Guided Point Cloud Registration in a Large Multi-Sensor Industrial Robot Cell\",\"authors\":\"Erind Ujkani, J. Dybedal, Atle Aalerud, Knut B. Kaldestad, G. Hovland\",\"doi\":\"10.1109/MESA.2018.8449195\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents a benchmark and accuracy analysis of 3D sensor calibration in a large industrial robot cell. The sensors used were the Kinect v2 which contains both an RGB and an IR camera measuring depth based on the time-of-flight principle. The approach taken was based on a novel procedure combining Aruco visual markers, methods using region of interest and iterative closest point. The calibration of sensors is performed pairwise, exploiting the fact that time-of-flight sensors can have some overlap in the generated point cloud data. For a volume measuring 10m × 14m × 5m a typical accuracy of the generated point cloud data of 5–10cm was achieved using six sensor nodes.\",\"PeriodicalId\":138936,\"journal\":{\"name\":\"2018 14th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA)\",\"volume\":\"41 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 14th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/MESA.2018.8449195\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 14th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MESA.2018.8449195","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 10

Abstract

This paper presents a benchmark and accuracy analysis of 3D sensor calibration in a large industrial robot cell. The sensors used were the Kinect v2, which contains both an RGB and an IR camera measuring depth based on the time-of-flight principle. The approach taken was based on a novel procedure combining ArUco visual markers, region-of-interest methods and iterative closest point. The calibration of the sensors is performed pairwise, exploiting the fact that time-of-flight sensors can have some overlap in the generated point cloud data. For a volume measuring 10 m × 14 m × 5 m, a typical accuracy of the generated point cloud data of 5–10 cm was achieved using six sensor nodes.
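The abstract summarizes the pipeline (ArUco markers for a coarse estimate, region-of-interest filtering, ICP for refinement) without implementation detail. The snippet below is a minimal sketch of how one such pairwise registration step could look, assuming OpenCV's ArUco module and Open3D's ICP; all function names, parameters and thresholds are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (not the authors' implementation): pairwise registration of two
# depth sensors that both observe a shared ArUco marker. OpenCV's aruco module
# (opencv-contrib, pre-4.7 API) gives a coarse sensor-to-sensor transform from
# the marker poses; Open3D ICP then refines it on the overlapping point clouds.
# Names such as marker_pose, register_pair and the voxel/distance values are
# illustrative assumptions.
import cv2
import numpy as np
import open3d as o3d

def marker_pose(gray, camera_matrix, dist_coeffs, marker_length_m):
    """Pose of a single ArUco marker in this camera's frame (4x4 homogeneous)."""
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is None:
        raise RuntimeError("no ArUco marker detected")
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, marker_length_m, camera_matrix, dist_coeffs)
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(rvecs[0])    # rotation: marker -> camera
    T[:3, 3] = tvecs[0].ravel()               # translation: marker -> camera
    return T

def register_pair(cloud_a, cloud_b, T_a_marker, T_b_marker, voxel=0.05):
    """Return the transform mapping sensor B's point cloud into sensor A's frame."""
    # Coarse initial guess via the shared marker: B -> marker -> A.
    T_init = T_a_marker @ np.linalg.inv(T_b_marker)
    src = cloud_b.voxel_down_sample(voxel)    # source: sensor B
    dst = cloud_a.voxel_down_sample(voxel)    # target: sensor A
    result = o3d.pipelines.registration.registration_icp(
        src, dst, 0.2, T_init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation
```

With six sensor nodes, such pairwise transforms can be chained to bring every sensor's point cloud into one common frame, consistent with the pairwise calibration described above.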