{"title":"Principles and practice of real-time visual tracking for navigation and mapping","authors":"Darius Burschka, Gregory Hager","doi":"10.1109/ROSE.2004.1317605","DOIUrl":null,"url":null,"abstract":"We present examples of real-time mobile navigation systems and approaches to vision-based Simultaneous Localization and Mapping (SLAM) based on the second generation of our image processing library, XVision. Although the application-field of XVision is not limited to mobile navigation, the real-time requirements and limited computational resources on mobile systems provide a good testbed for it. We discuss the general architecture of the library and its modular processing pipeline, and we show how multiple tracking and low-level image processing primitives for color, texture and disparity can be combined to produce vision-guided navigation systems. The applications we discuss make use of XVision capabilities to solve the temporal correspondence problem by tracking an image feature in a given image domain.","PeriodicalId":142501,"journal":{"name":"International Workshop on Robot Sensing, 2004. ROSE 2004.","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2004-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Workshop on Robot Sensing, 2004. ROSE 2004.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ROSE.2004.1317605","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
We present examples of real-time mobile navigation systems and approaches to vision-based Simultaneous Localization and Mapping (SLAM) built on the second generation of our image processing library, XVision. Although the application field of XVision is not limited to mobile navigation, the real-time requirements and limited computational resources of mobile systems make them a good testbed for it. We discuss the general architecture of the library and its modular processing pipeline, and we show how multiple tracking and low-level image processing primitives for color, texture, and disparity can be combined to produce vision-guided navigation systems. The applications we discuss use XVision's capabilities to solve the temporal correspondence problem by tracking an image feature within a given image domain.
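To make the idea of a modular tracking pipeline concrete, the following is a minimal C++ sketch. It is not the actual XVision API; all class and function names (TrackingPrimitive, ColorTracker, TextureTracker, TrackingPipeline) are hypothetical, and the per-primitive update logic is a placeholder. The sketch only illustrates how per-frame tracking primitives for different cues (e.g., color and texture) could share a common interface and be composed so that each feature's state is carried from frame to frame, which is the temporal correspondence problem described in the abstract.

```cpp
// Hypothetical sketch (not the XVision API): composing tracking primitives
// into a modular per-frame pipeline that maintains temporal correspondence.
#include <iostream>
#include <memory>
#include <vector>

// Minimal stand-in for an image; a real system would wrap camera frames.
struct Frame {
    int width = 640, height = 480;
    long timestamp = 0;
};

// Feature state carried from frame to frame (temporal correspondence).
struct FeatureState {
    double x = 0.0, y = 0.0;  // image position
    bool valid = true;        // is the tracker still locked on?
};

// Common interface shared by all tracking primitives in the pipeline.
class TrackingPrimitive {
public:
    virtual ~TrackingPrimitive() = default;
    virtual const char* name() const = 0;
    // Update the feature state from the new frame; return false on loss.
    virtual bool update(const Frame& frame, FeatureState& state) = 0;
};

// Color-blob tracker: a real implementation would segment a color region
// and re-center the feature on its centroid.
class ColorTracker : public TrackingPrimitive {
public:
    const char* name() const override { return "color"; }
    bool update(const Frame&, FeatureState& state) override {
        state.x += 0.5;  // placeholder motion, not real segmentation
        return true;
    }
};

// Texture (template/SSD) tracker: a real implementation would minimize the
// intensity difference between a stored template and the current window.
class TextureTracker : public TrackingPrimitive {
public:
    const char* name() const override { return "texture"; }
    bool update(const Frame&, FeatureState& state) override {
        state.y += 0.25;  // placeholder motion, not real registration
        return true;
    }
};

// Pipeline: runs every primitive on each incoming frame, keeping one
// feature state per primitive consistent over time.
class TrackingPipeline {
public:
    void add(std::unique_ptr<TrackingPrimitive> p) {
        primitives_.push_back(std::move(p));
        states_.emplace_back();
    }
    void process(const Frame& frame) {
        for (size_t i = 0; i < primitives_.size(); ++i) {
            if (!primitives_[i]->update(frame, states_[i])) {
                states_[i].valid = false;  // feature lost; could re-acquire
            }
            std::cout << primitives_[i]->name() << ": ("
                      << states_[i].x << ", " << states_[i].y << ")\n";
        }
    }
private:
    std::vector<std::unique_ptr<TrackingPrimitive>> primitives_;
    std::vector<FeatureState> states_;
};

int main() {
    TrackingPipeline pipeline;
    pipeline.add(std::make_unique<ColorTracker>());
    pipeline.add(std::make_unique<TextureTracker>());

    Frame frame;
    for (int t = 0; t < 3; ++t) {  // simulate three incoming frames
        frame.timestamp = t;
        pipeline.process(frame);
    }
    return 0;
}
```

The design choice illustrated here, a shared primitive interface plus a pipeline that owns per-feature state, is one plausible way to combine cue-specific trackers on a resource-limited mobile platform; the actual library architecture is described in the paper itself.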