Chuankai Liu, Xiaoxue Wang, Jia Wang, Jianhua Su, Baofeng Wang, G. Tang, Xiangyan Guo
{"title":"基于单目视觉的空间飞行目标快速跟踪与精确位姿估计","authors":"Chuankai Liu, Xiaoxue Wang, Jia Wang, Jianhua Su, Baofeng Wang, G. Tang, Xiangyan Guo","doi":"10.1109/CGNCC.2016.7829113","DOIUrl":null,"url":null,"abstract":"Autonomous rendezvous and robotic capturing of flying targets is widely used in many space shuttle missions and it is very crucial for on-orbit service. To perform this task, tracking and pose estimation of the flying targets is usually considered as one of the most important issues, needing to be addressed among the whole process. Taking account of the specificity of space environment such as lighting and the continuity of the rendezvous process, in this paper, we design a fast tracking and accurate pose estimation algorithm for cooperative luminaries (or retro-reflectors), to guide a safe and reliable capturing operation. Different from available target tracking or searching method, this paper defines a new comparability measure function for target appearance and utilizes the continuity of target moving to limit the target search in a predicted range of the image, which accelerates the search process. Meanwhile, the projective shape changes of each luminary due to rotation are also considered to help improve the accuracy of the target extraction. With the positions of multiple target spots obtained from the image, least square method is applied to adjust the spatial pose results iteratively, and finally accurate pose estimation is achieved. 
Experiments on the simulated space target, which consists of six LEDs, validate the proposed method.","PeriodicalId":426650,"journal":{"name":"2016 IEEE Chinese Guidance, Navigation and Control Conference (CGNCC)","volume":"21 4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Fast tracking and accurate pose estimation of space flying target based on monocular vision\",\"authors\":\"Chuankai Liu, Xiaoxue Wang, Jia Wang, Jianhua Su, Baofeng Wang, G. Tang, Xiangyan Guo\",\"doi\":\"10.1109/CGNCC.2016.7829113\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Autonomous rendezvous and robotic capturing of flying targets is widely used in many space shuttle missions and it is very crucial for on-orbit service. To perform this task, tracking and pose estimation of the flying targets is usually considered as one of the most important issues, needing to be addressed among the whole process. Taking account of the specificity of space environment such as lighting and the continuity of the rendezvous process, in this paper, we design a fast tracking and accurate pose estimation algorithm for cooperative luminaries (or retro-reflectors), to guide a safe and reliable capturing operation. Different from available target tracking or searching method, this paper defines a new comparability measure function for target appearance and utilizes the continuity of target moving to limit the target search in a predicted range of the image, which accelerates the search process. Meanwhile, the projective shape changes of each luminary due to rotation are also considered to help improve the accuracy of the target extraction. With the positions of multiple target spots obtained from the image, least square method is applied to adjust the spatial pose results iteratively, and finally accurate pose estimation is achieved. 
Experiments on the simulated space target, which consists of six LEDs, validate the proposed method.\",\"PeriodicalId\":426650,\"journal\":{\"name\":\"2016 IEEE Chinese Guidance, Navigation and Control Conference (CGNCC)\",\"volume\":\"21 4 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 IEEE Chinese Guidance, Navigation and Control Conference (CGNCC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CGNCC.2016.7829113\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 IEEE Chinese Guidance, Navigation and Control Conference (CGNCC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CGNCC.2016.7829113","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Fast tracking and accurate pose estimation of space flying target based on monocular vision
Autonomous rendezvous with and robotic capture of flying targets is widely used in space missions and is crucial for on-orbit servicing. To perform this task, tracking and pose estimation of the flying target is usually considered one of the most important issues to be addressed throughout the process. Taking into account the specific conditions of the space environment, such as lighting, and the continuity of the rendezvous process, this paper designs a fast tracking and accurate pose estimation algorithm for cooperative luminaires (or retro-reflectors) to guide a safe and reliable capture operation. Unlike existing target tracking and search methods, this paper defines a new similarity measure function for target appearance and exploits the continuity of target motion to limit the search to a predicted region of the image, which accelerates the search process. Meanwhile, the projective shape change of each luminaire due to rotation is also considered to improve the accuracy of target extraction. With the positions of multiple target spots obtained from the image, the least-squares method is applied to refine the spatial pose iteratively, finally achieving accurate pose estimation. Experiments on a simulated space target consisting of six LEDs validate the proposed method.
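The final step the abstract describes — iteratively adjusting the spatial pose by least squares from the extracted spot positions — can be sketched as a Gauss-Newton minimization of the reprojection error. The sketch below is illustrative only, not the authors' implementation: the LED coordinates, the camera intrinsics (`f`, `cx`, `cy`), and the use of a numerical Jacobian are all assumptions.

```python
import numpy as np

def rodrigues(rvec):
    """Rotation matrix from an axis-angle vector (Rodrigues' formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def project(points3d, rvec, tvec, f=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of target-frame 3-D points into the image plane.
    Intrinsics are hypothetical placeholder values."""
    cam = points3d @ rodrigues(rvec).T + tvec
    return np.column_stack((f * cam[:, 0] / cam[:, 2] + cx,
                            f * cam[:, 1] / cam[:, 2] + cy))

def refine_pose(points3d, spots2d, rvec0, tvec0, iters=20):
    """Gauss-Newton least-squares refinement of pose (rvec, tvec)
    given the measured image positions of the luminaire spots."""
    x = np.concatenate((rvec0.astype(float), tvec0.astype(float)))
    for _ in range(iters):
        r = (project(points3d, x[:3], x[3:]) - spots2d).ravel()
        # Numerical Jacobian of the reprojection residual w.r.t. the 6 pose params.
        J = np.zeros((r.size, 6))
        eps = 1e-6
        for j in range(6):
            dx = np.zeros(6)
            dx[j] = eps
            r2 = (project(points3d, (x + dx)[:3], (x + dx)[3:]) - spots2d).ravel()
            J[:, j] = (r2 - r) / eps
        # Gauss-Newton step: solve the linearized least-squares problem.
        x -= np.linalg.lstsq(J, r, rcond=None)[0]
    return x[:3], x[3:]
```

In a tracking loop, the previous frame's pose would be a natural initial guess; the same motion-continuity assumption also supplies the predicted image region in which the spot search is confined.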