Multi-Modal Detection Fusion on a Mobile UGV for Wide-Area, Long-Range Surveillance

Authors: Matt Brown, Keith Fieldhouse, E. Swears, Paul Tunison, Adam Romlein, A. Hoogs
Venue: 2019 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/WACV.2019.00207 (https://doi.org/10.1109/WACV.2019.00207)
Publication date: 2019-01-01
Citations: 2
Abstract
We introduce a self-contained, mobile surveillance system designed to remotely detect and track people in real time, at long ranges, and over a wide field of view in cluttered urban and natural settings. The system is integrated with an unmanned ground vehicle, which hosts an array of four IR and four high-resolution RGB cameras, navigational sensors, and onboard processing computers. High-confidence, low-false-alarm-rate person tracks are produced by fusing motion detections and single-frame CNN person detections between co-registered RGB and IR video streams. Processing speeds are increased by using semantic scene segmentation and a tiered inference scheme to focus processing on the most salient regions of the 43° × 7.8° composite field of view. The system autonomously produces alerts of human presence and movement within the field of view, which are disseminated over a radio network and remotely viewed on a tablet computer. We present an ablation study quantifying the benefits that multi-sensor, multi-detector fusion brings to the problem of detecting people in challenging outdoor environments with shadows, occlusions, clutter, and variable weather conditions.
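The abstract describes fusing person detections between co-registered RGB and IR streams to raise confidence and suppress false alarms. The paper's actual fusion pipeline is not specified at this level of detail, so the following is only a minimal illustrative sketch of one common approach: greedy IoU matching of bounding boxes across modalities, with a noisy-OR score combination when both modalities agree. All function names, thresholds, and the scoring rule are hypothetical, not taken from the paper.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def fuse_detections(rgb_dets, ir_dets, iou_thresh=0.5):
    """Hypothetical cross-modal fusion sketch (not the paper's method).

    Each detection is a (box, score) pair in a shared, co-registered image
    frame. RGB detections that overlap an IR detection above iou_thresh get
    a boosted score via noisy-OR; unmatched detections from either modality
    are kept at their single-modality confidence.
    """
    fused, used_ir = [], set()
    for box_r, score_r in rgb_dets:
        best_j, best_iou = -1, iou_thresh
        for j, (box_i, _) in enumerate(ir_dets):
            if j in used_ir:
                continue
            overlap = iou(box_r, box_i)
            if overlap >= best_iou:
                best_j, best_iou = j, overlap
        if best_j >= 0:
            used_ir.add(best_j)
            score_i = ir_dets[best_j][1]
            # Both modalities agree: combine scores with a noisy-OR rule.
            fused.append((box_r, 1 - (1 - score_r) * (1 - score_i)))
        else:
            fused.append((box_r, score_r))
    # Keep IR-only detections at their original confidence.
    fused.extend(d for j, d in enumerate(ir_dets) if j not in used_ir)
    return fused
```

In a sketch like this, detections confirmed by both sensors end up with higher confidence than any single-modality detection, which is one simple way a downstream tracker could achieve the low-false-alarm behavior the abstract reports; the paper's real system additionally fuses motion detections and uses tiered inference, which this fragment does not model.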