{"title":"Neural network-based navigation filter for monocular pose and motion tracking of noncooperative spacecraft","authors":"Zilong Chen, Haichao Gui, Rui Zhong","doi":"10.1016/j.asr.2024.11.006","DOIUrl":null,"url":null,"abstract":"<div><div>This paper presents a neural network-based multiplicative extended kalman filtering (MEKF) to track the pose (attitude and position) and 6D motion (angular velocity and linear velocity) of a known tumbling spacecraft using monocular camera for close-proximity operations. The proposed MEKF estimates pose and 6D motion of the target spacecraft relative to servicing spacecraft by combining dynamic and kinematic information with pseudo-measurement information provided by convolutional neural network (CNN). Specifically, a modified EfficientNet-B3 CNN architecture is constructed for recovering pose directly, while the 2D and 6D motion mapping equation is derived to recover 6D motion from monocular video stream. The former is recovered by using the CNN to learn the implicit mapping relationship between image and pose. The latter is recovered using motion mapping equation which combines pose pseudo-measurements with information of two consecutive image frames. The advantage of the proposed method is that it can provide full relative state pseudo-measurements including pose and 6D motion to the filter merely using a single monocular camera as sensor. Subsequently, the Failed Spacecraft Close-range Observation Dataset (FSCOD) is constructed to analyze the performance of the proposed method. The FSCOD consists of training and validation images independent of dynamic trajectories and test images induced by dynamic trajectories of tumbling spacecraft, both of which are generated by Unreal Engine 4 (UE4). Numerical simulations are conducted utilizing sequential images to evaluate the performance of the proposed MEKF and the modified EfficientNet-B3. The results show that the estimation accuracy of the modified EfficientNet-B3 is approximate to that of the UrsoNet, whereas its network parameters are only half of the UrsoNet. Moreover, the proposed MEKF has higher steady-state pose estimation accuracy than the UrsoNet baseline method and is robust to initial estimation errors and measurement noise.</div></div>","PeriodicalId":50850,"journal":{"name":"Advances in Space Research","volume":"75 3","pages":"Pages 2908-2928"},"PeriodicalIF":2.8000,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in Space Research","FirstCategoryId":"89","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0273117724011232","RegionNum":3,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Abstract
This paper presents a neural network-based multiplicative extended Kalman filter (MEKF) to track the pose (attitude and position) and 6D motion (angular velocity and linear velocity) of a known tumbling spacecraft using a monocular camera for close-proximity operations. The proposed MEKF estimates the pose and 6D motion of the target spacecraft relative to the servicing spacecraft by combining dynamic and kinematic information with pseudo-measurements provided by a convolutional neural network (CNN). Specifically, a modified EfficientNet-B3 CNN architecture is constructed to recover pose directly, while a 2D-to-6D motion mapping equation is derived to recover 6D motion from the monocular video stream. The former uses the CNN to learn the implicit mapping between image and pose; the latter applies the motion mapping equation, which combines pose pseudo-measurements with information from two consecutive image frames. The advantage of the proposed method is that it provides full relative-state pseudo-measurements, including pose and 6D motion, to the filter using only a single monocular camera as the sensor. Subsequently, the Failed Spacecraft Close-range Observation Dataset (FSCOD) is constructed to analyze the performance of the proposed method. The FSCOD consists of training and validation images independent of dynamic trajectories and test images induced by dynamic trajectories of tumbling spacecraft, both generated with Unreal Engine 4 (UE4). Numerical simulations on sequential images evaluate the performance of the proposed MEKF and the modified EfficientNet-B3. The results show that the estimation accuracy of the modified EfficientNet-B3 is comparable to that of UrsoNet while using only half as many network parameters. Moreover, the proposed MEKF achieves higher steady-state pose estimation accuracy than the UrsoNet baseline and is robust to initial estimation errors and measurement noise.
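The abstract does not reproduce the paper's filter equations, so the following is only a minimal NumPy sketch of a generic MEKF measurement update driven by a CNN pose pseudo-measurement, illustrating the multiplicative attitude-error handling the abstract refers to. The 12-dimensional error state ordering, the scalar-last quaternion convention, the linear measurement matrix H, and the finite-difference stand-in for the paper's 2D-to-6D motion mapping equation are all assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def quat_mul(q1, q2):
    """Hamilton product, scalar-last convention [x, y, z, w]."""
    x1, y1, z1, w1 = q1
    x2, y2, z2, w2 = q2
    return np.array([w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2,
                     w1*w2 - x1*x2 - y1*y2 - z1*z2])

def quat_conj(q):
    return np.array([-q[0], -q[1], -q[2], q[3]])

def motion_pseudo_measurement(q_prev, q_curr, p_prev, p_curr, dt):
    """Finite-difference 6D-motion pseudo-measurement from two consecutive
    frames (a simple stand-in for the paper's motion mapping equation)."""
    dq = quat_mul(q_curr, quat_conj(q_prev))
    w = 2.0 * np.copysign(1.0, dq[3]) * dq[:3] / dt  # small-angle rate
    v = (p_curr - p_prev) / dt
    return w, v

def mekf_pose_update(q_ref, x, P, q_meas, p_meas, R):
    """One MEKF update with a CNN pose pseudo-measurement.

    q_ref : reference attitude quaternion (scalar last)
    x     : nominal [position(3), angular velocity(3), linear velocity(3)]
    P     : 12x12 covariance of the error state [dtheta, dp, dw, dv]
    """
    # Attitude innovation: small rotation taking q_ref to q_meas
    dq = quat_mul(q_meas, quat_conj(q_ref))
    dtheta = 2.0 * np.copysign(1.0, dq[3]) * dq[:3]  # shortest-arc angle

    # Stack attitude and position innovations (6x1)
    y = np.concatenate([dtheta, p_meas - x[:3]])

    # Assumed measurement model: the CNN observes attitude error and position
    H = np.zeros((6, 12))
    H[:3, :3] = np.eye(3)
    H[3:, 3:6] = np.eye(3)

    # Standard Kalman gain and error-state correction
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    dx = K @ y
    P = (np.eye(12) - K @ H) @ P

    # Multiplicative reset of the attitude, additive update of the rest
    dq_corr = np.concatenate([0.5 * dx[:3], [1.0]])
    q_ref = quat_mul(dq_corr / np.linalg.norm(dq_corr), q_ref)
    x = x + dx[3:]
    return q_ref / np.linalg.norm(q_ref), x, P
```

The defining feature sketched here is that the attitude correction is injected multiplicatively through a small error quaternion rather than added to the quaternion components, which keeps the reference quaternion unit-norm; this is what distinguishes an MEKF from an additive EKF on quaternion states.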
Journal Introduction
The COSPAR publication Advances in Space Research (ASR) is an open journal covering all areas of space research including: space studies of the Earth's surface, meteorology, climate, the Earth-Moon system, planets and small bodies of the solar system, upper atmospheres, ionospheres and magnetospheres of the Earth and planets including reference atmospheres, space plasmas in the solar system, astrophysics from space, materials sciences in space, fundamental physics in space, space debris, space weather, Earth observations of space phenomena, etc.
NB: Please note that manuscripts related to life sciences as related to space are no longer accepted for submission to Advances in Space Research. Such manuscripts should now be submitted to the new COSPAR journal Life Sciences in Space Research (LSSR).
All submissions are reviewed by two scientists in the field. COSPAR is an interdisciplinary scientific organization concerned with the progress of space research on an international scale. Operating under the rules of ICSU, COSPAR ignores political considerations and considers all questions solely from the scientific viewpoint.