Hongkai Wen, Yiran Shen, Savvas Papaioannou, W. Churchill, A. Trigoni, P. Newman
{"title":"自主地面车辆的机会无线电辅助导航","authors":"Hongkai Wen, Yiran Shen, Savvas Papaioannou, W. Churchill, A. Trigoni, P. Newman","doi":"10.1109/DCOSS.2015.22","DOIUrl":null,"url":null,"abstract":"Navigating autonomous ground vehicles with visual sensors has many advantages - it does not rely on global maps, yet is accurate and reliable even in GPS-denied environments. However, due to the limitation of the camera field of view, one typically has to record a large number of visual experiences for practical navigation. In this paper, we explore new avenues in linking together visual experiences, by opportunistically harvesting and sharing a variety of radio signals emitted by surrounding stationary access points and mobile devices. We propose a novel navigation approach, which exploits side-channel information of co-location to thread up visually-separated experiences with short exploration phases. The proposed approach empowers users to trade travel time for manual navigation effort, allowing them to choose the itinerary that best serves their needs. We evaluate the proposed approach with data collected from a typical urban area, and show that it achieves much better navigation performance in both reach ability and cost, comparing with the state of the arts that only use visual information.","PeriodicalId":332746,"journal":{"name":"2015 International Conference on Distributed Computing in Sensor Systems","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2015-06-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Opportunistic Radio Assisted Navigation for Autonomous Ground Vehicles\",\"authors\":\"Hongkai Wen, Yiran Shen, Savvas Papaioannou, W. Churchill, A. Trigoni, P. 
Newman\",\"doi\":\"10.1109/DCOSS.2015.22\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Navigating autonomous ground vehicles with visual sensors has many advantages - it does not rely on global maps, yet is accurate and reliable even in GPS-denied environments. However, due to the limitation of the camera field of view, one typically has to record a large number of visual experiences for practical navigation. In this paper, we explore new avenues in linking together visual experiences, by opportunistically harvesting and sharing a variety of radio signals emitted by surrounding stationary access points and mobile devices. We propose a novel navigation approach, which exploits side-channel information of co-location to thread up visually-separated experiences with short exploration phases. The proposed approach empowers users to trade travel time for manual navigation effort, allowing them to choose the itinerary that best serves their needs. We evaluate the proposed approach with data collected from a typical urban area, and show that it achieves much better navigation performance in both reach ability and cost, comparing with the state of the arts that only use visual information.\",\"PeriodicalId\":332746,\"journal\":{\"name\":\"2015 International Conference on Distributed Computing in Sensor Systems\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-06-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 International Conference on Distributed Computing in Sensor 
Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/DCOSS.2015.22\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 International Conference on Distributed Computing in Sensor Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DCOSS.2015.22","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Opportunistic Radio Assisted Navigation for Autonomous Ground Vehicles
Navigating autonomous ground vehicles with visual sensors has many advantages: it does not rely on global maps, yet is accurate and reliable even in GPS-denied environments. However, due to the limited field of view of the camera, one typically has to record a large number of visual experiences for practical navigation. In this paper, we explore new avenues for linking visual experiences together by opportunistically harvesting and sharing a variety of radio signals emitted by surrounding stationary access points and mobile devices. We propose a novel navigation approach that exploits side-channel co-location information to thread visually-separated experiences together with short exploration phases. The proposed approach empowers users to trade travel time for manual navigation effort, allowing them to choose the itinerary that best serves their needs. We evaluate the proposed approach with data collected from a typical urban area, and show that it achieves much better navigation performance in both reachability and cost, compared with state-of-the-art approaches that use only visual information.
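The core idea of threading visually-separated experiences via radio co-location can be sketched as a graph problem: experiences become nodes, and an edge is added whenever two experiences observed enough common radio sources (e.g. Wi-Fi access point MACs), after which routes are found by shortest-path search. The following is a minimal illustrative sketch, not the paper's actual algorithm; the experience identifiers, the `min_shared` threshold, and the unit edge costs are all hypothetical assumptions for demonstration.

```python
from collections import defaultdict
import heapq


def colocation_edges(radio_obs, min_shared=2):
    """Link experiences that heard enough common radio sources.

    radio_obs: dict mapping experience id -> set of radio source ids
    (e.g. Wi-Fi MAC addresses) observed along that experience.
    Returns a set of undirected edges as (id_a, id_b) pairs.
    """
    edges = set()
    ids = sorted(radio_obs)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            # Co-location side channel: overlapping radio observations
            # suggest the two experiences pass through nearby space.
            if len(radio_obs[a] & radio_obs[b]) >= min_shared:
                edges.add((a, b))
    return edges


def shortest_route(edges, costs, start, goal):
    """Dijkstra over the experience graph; returns (total_cost, path).

    costs maps an edge pair to its travel cost; unknown edges default
    to a unit cost of 1.0 (an assumption for this sketch).
    """
    graph = defaultdict(list)
    for a, b in edges:
        w = costs.get((a, b), costs.get((b, a), 1.0))
        graph[a].append((b, w))
        graph[b].append((a, w))
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, w in graph[node]:
            if nxt not in visited:
                heapq.heappush(frontier, (cost + w, nxt, path + [nxt]))
    return float("inf"), []
```

For example, two experiences that both heard access points `ap2` and `ap3` get linked even if their camera views never overlap, which is what lets the navigator reach places no single visual experience covers.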