{"title":"Slasher: Stadium racer car for event camera end-to-end learning autonomous driving experiments","authors":"Yuhuang Hu, Hong Ming Chen, T. Delbrück","doi":"10.1109/AICAS.2019.8771520","DOIUrl":null,"url":null,"abstract":"Slasher is the first open 1/10 scale autonomous driving platform for exploring the use of neuromorphic event cameras for fast driving in unstructured indoor and outdoor environments. Slasher features a DAVIS event-based camera and ROS computer for perception and control. The DAVIS camera provides high dynamic range, sparse output, and sub-millisecond latency output for the quick visual control needed for fast driving. A race controller and Bluetooth remote joystick are used to coordinate different processing pipelines, and a low-cost ultra-wide-band (UWB) positioning system records trajectories. The modular design of Slasher can easily integrate additional features and sensors. In this paper, we show its application in a reflexive Convolutional Neural Network (CNN) steering controller trained by end-to-end learning. We present preliminary experiments in closed-loop indoor and outdoor trail driving.","PeriodicalId":273095,"journal":{"name":"2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AICAS.2019.8771520","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Slasher is the first open 1/10-scale autonomous driving platform for exploring the use of neuromorphic event cameras for fast driving in unstructured indoor and outdoor environments. Slasher features a DAVIS event-based camera and a ROS computer for perception and control. The DAVIS camera provides high dynamic range, sparse output, and sub-millisecond latency for the quick visual control needed for fast driving. A race controller and a Bluetooth remote joystick coordinate the different processing pipelines, and a low-cost ultra-wide-band (UWB) positioning system records trajectories. Slasher's modular design makes it easy to integrate additional features and sensors. In this paper, we show its application in a reflexive Convolutional Neural Network (CNN) steering controller trained by end-to-end learning. We present preliminary experiments in closed-loop indoor and outdoor trail driving.
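The abstract does not spell out how the event stream is turned into CNN input or how the steering network is trained, so the following is only a minimal illustrative sketch of the general idea of an end-to-end event-camera steering controller, not the authors' released code. It assumes PyTorch, DAVIS240-style 240x180 resolution, simple fixed-window event accumulation into count frames, and a hypothetical `SteeringCNN` regressing a normalized steering value from one frame; the real Slasher pipeline may differ in all of these choices.

```python
# Hypothetical sketch (not the paper's implementation): accumulate DAVIS events
# into a 2D frame and regress a steering command with a small CNN trained
# end-to-end against recorded human (joystick) steering.
import torch
import torch.nn as nn

def events_to_frame(events, height=180, width=240):
    """Accumulate (x, y, polarity) events into a single-channel frame.

    `events` is an (N, 3) tensor of integer x, y coordinates and +/-1 polarity;
    DAVIS240 resolution (240x180) is assumed here.
    """
    frame = torch.zeros(1, height, width)
    x, y, p = events[:, 0].long(), events[:, 1].long(), events[:, 2].float()
    frame[0].index_put_((y, x), p, accumulate=True)  # signed event counts per pixel
    return frame

class SteeringCNN(nn.Module):
    """Small reflexive CNN mapping one event frame to a steering value in [-1, 1]."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1), nn.Tanh()
        )

    def forward(self, frame):
        return self.head(self.features(frame))

if __name__ == "__main__":
    # One end-to-end training step on synthetic data, standing in for a
    # recorded (event window, joystick steering) pair.
    model = SteeringCNN()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    fake_events = torch.stack([
        torch.randint(0, 240, (500,)),        # x coordinates
        torch.randint(0, 180, (500,)),        # y coordinates
        torch.randint(0, 2, (500,)) * 2 - 1,  # polarity in {-1, +1}
    ], dim=1)
    frame = events_to_frame(fake_events).unsqueeze(0)  # shape (1, 1, 180, 240)
    target_steering = torch.tensor([[0.2]])            # recorded steering label
    loss = nn.functional.mse_loss(model(frame), target_steering)
    loss.backward()
    optimizer.step()
    print("loss:", loss.item())
```

In a closed-loop setup of this kind, the same `events_to_frame` step would run on the ROS computer at a fixed rate, and the CNN's output would be sent to the steering servo, with the UWB positioning system logging the resulting trajectories for evaluation.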