F-PCNet: A New Fast Object Detection Method Based on Point Cloud Only*

Zhiwei Xing, Lu Li, Runchao Ye, Jintao Wang, Xiaorui Zhu, Junting Lv
{"title":"F-PCNet: A New Fast Object Detection Method Based on Point Cloud Only*","authors":"Zhiwei Xing, Lu Li, Runchao Ye, Jintao Wang, Xiaorui Zhu, Junting Lv","doi":"10.1109/RCAR54675.2022.9872254","DOIUrl":null,"url":null,"abstract":"Although deep learning methods have greatly improved the accuracy of the object detection tasks, it is still challenging to balance the efficiency and accuracy of the algorithms under circumstances of point clouds only. In this paper, an anchor-free one-stage deep neural network, F-PCNet, is proposed to realize real-time detection based on point clouds on an autonomous driving platform while maintaining high accuracy. The proposed network takes the bird’s eye view of point clouds collected by LiDAR as input, and outputs the category and 2D bounding box of each detected object. The backbone of F-PCNet is composed of residual network modules of different sizes which effectively reduce the impact of learning degradation. The anchor-free detection head enables F-PCNet to achieve high levels of accuracy and efficiency. Experimental results show that F-PCNet achieves high detection accuracy in a short time consumption and is suitable for real-time detecting scenarios.","PeriodicalId":304963,"journal":{"name":"2022 IEEE International Conference on Real-time Computing and Robotics (RCAR)","volume":"47 23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Real-time Computing and Robotics (RCAR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/RCAR54675.2022.9872254","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Although deep learning methods have greatly improved the accuracy of object detection, it remains challenging to balance efficiency and accuracy when only point clouds are available. In this paper, an anchor-free, one-stage deep neural network, F-PCNet, is proposed to realize real-time, point-cloud-based detection on an autonomous driving platform while maintaining high accuracy. The proposed network takes the bird's-eye view of point clouds collected by LiDAR as input and outputs the category and 2D bounding box of each detected object. The backbone of F-PCNet is composed of residual network modules of different sizes, which effectively reduces the impact of learning degradation. The anchor-free detection head enables F-PCNet to achieve high levels of accuracy and efficiency. Experimental results show that F-PCNet achieves high detection accuracy with low time consumption and is suitable for real-time detection scenarios.
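To make the described architecture concrete, below is a minimal sketch, not the authors' implementation, of an anchor-free, one-stage BEV detector in the spirit of the abstract: a residual backbone over a bird's-eye-view (BEV) pseudo-image, followed by per-cell heads that predict a class heat-map and 2D box parameters. All layer sizes, channel counts, and the meaning of the BEV channels are assumptions for illustration only.

```python
# Hypothetical sketch (not F-PCNet's actual code) of an anchor-free BEV detector:
# residual backbone + per-cell classification and 2D box regression heads.
import torch
import torch.nn as nn


class ResBlock(nn.Module):
    """Basic residual block; the skip connection mitigates learning degradation."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(x + self.body(x))


class BEVDetector(nn.Module):
    """Toy anchor-free detector: BEV input -> class heat-map + 2D box regression."""
    def __init__(self, in_channels: int = 3, channels: int = 64, num_classes: int = 3):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(in_channels, channels, 3, stride=2, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.backbone = nn.Sequential(ResBlock(channels), ResBlock(channels))
        # Anchor-free head: one score per class per BEV cell, plus a 2D box
        # (center offsets dx, dy and size w, l) regressed directly at that cell.
        self.cls_head = nn.Conv2d(channels, num_classes, 1)
        self.box_head = nn.Conv2d(channels, 4, 1)

    def forward(self, bev):
        feat = self.backbone(self.stem(bev))
        return self.cls_head(feat).sigmoid(), self.box_head(feat)


if __name__ == "__main__":
    # A 3-channel, 512x512 BEV pseudo-image (e.g. height / intensity / density
    # maps rasterized from the LiDAR point cloud) -- channel meaning is assumed.
    model = BEVDetector()
    scores, boxes = model(torch.zeros(1, 3, 512, 512))
    print(scores.shape, boxes.shape)  # (1, 3, 256, 256) and (1, 4, 256, 256)
```

Predicting boxes directly at each BEV cell, rather than scoring a set of predefined anchors, removes anchor matching and per-anchor computation, which is the efficiency argument the abstract attributes to the anchor-free head.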