{"title":"Kalman-Filter-Based Machine Vision for Controlling Free-Flying Unmanned Remote Vehicles","authors":"H. Alexander, A. Azarbayejani, H. J. Weigl","doi":"10.23919/ACC.1992.4792475","DOIUrl":null,"url":null,"abstract":"Automatic control of a robotic vehicle requires a navigation system to determine the vehicle's position and motion at each sampling interval for feedback corrections to be made. Human beings largely depend on vision for their own navigation, and it provides high-quality navigation data in a wide variety of environments. Machine-based vision systems have generally been too computationally expensive and slow, however, for use in real-time control systems. The system provided here achieves the required speed for real-time control through use of simple geometric models of the perceived target, dependence on tracking rather than object recognition, and reduction of the scene analysis task from a two-dimensional process to a set of one-dimensional scans through the image. The system is intended for application to a neutrally-buoyant vehicle called STAR that simulates a freely-flying, extravehicular space robot. The vision system will support development of autonomous and teleoperator control technologies for space robots, and the experimental results presented here result from preliminary target-pointing experiments with the STAR vehicle.","PeriodicalId":297258,"journal":{"name":"1992 American Control Conference","volume":"2013 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1992-06-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"1992 American Control Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/ACC.1992.4792475","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Automatic control of a robotic vehicle requires a navigation system to determine the vehicle's position and motion at each sampling interval so that feedback corrections can be made. Human beings largely depend on vision for their own navigation, and it provides high-quality navigation data in a wide variety of environments. Machine-based vision systems, however, have generally been too computationally expensive and slow for use in real-time control systems. The system described here achieves the speed required for real-time control through the use of simple geometric models of the perceived target, dependence on tracking rather than object recognition, and reduction of the scene analysis task from a two-dimensional process to a set of one-dimensional scans through the image. The system is intended for application to a neutrally buoyant vehicle called STAR that simulates a free-flying, extravehicular space robot. The vision system will support development of autonomous and teleoperator control technologies for space robots, and the experimental results presented here come from preliminary target-pointing experiments with the STAR vehicle.
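To illustrate the tracking idea the abstract describes, the sketch below shows a minimal constant-velocity Kalman filter that predicts a target's image-plane position between frames, so that image analysis can be confined to one-dimensional scans around the prediction. All specifics (the state model, sampling interval, and noise covariances) are illustrative assumptions; the abstract does not give the paper's actual filter formulation.

```python
import numpy as np

# Minimal sketch: constant-velocity Kalman filter for tracking a target's
# image-plane position between frames. The state model, dt, Q, and R below
# are assumptions for illustration, not the paper's actual parameters.

dt = 0.1  # sampling interval in seconds (assumed)

# State: [x, y, vx, vy]; measurement: [x, y] (target position in the image)
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)   # constant-velocity dynamics
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)    # only position is observed
Q = 0.01 * np.eye(4)   # process noise covariance (assumed)
R = 1.0 * np.eye(2)    # measurement noise covariance (assumed)

x = np.zeros(4)        # initial state estimate
P = np.eye(4)          # initial state covariance

def step(z):
    """One predict/update cycle given a measured target position z = [x, y]."""
    global x, P
    # Predict where the target will appear in the next frame; the prediction
    # is where one-dimensional image scans could be centered.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the position measured from the image.
    innovation = z - H @ x
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ innovation
    P = (np.eye(4) - K @ H) @ P
    return x

# Example: feed in a few noisy position measurements
for z in [np.array([10.0, 5.0]), np.array([10.5, 5.2]), np.array([11.1, 5.4])]:
    print(step(z))
```

The appeal of this structure for real-time control is that the filter's prediction bounds the search region in each new frame, which is consistent with the abstract's emphasis on tracking rather than full-image object recognition.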