Neural network-based navigation filter for monocular pose and motion tracking of noncooperative spacecraft

Advances in Space Research | IF 2.8 | JCR Q2 (Astronomy & Astrophysics) | CAS Tier 3 (Earth Science) | Pub Date: 2025-02-01 | DOI: 10.1016/j.asr.2024.11.006
Zilong Chen, Haichao Gui, Rui Zhong
{"title":"Neural network-based navigation filter for monocular pose and motion tracking of noncooperative spacecraft","authors":"Zilong Chen,&nbsp;Haichao Gui,&nbsp;Rui Zhong","doi":"10.1016/j.asr.2024.11.006","DOIUrl":null,"url":null,"abstract":"<div><div>This paper presents a neural network-based multiplicative extended kalman filtering (MEKF) to track the pose (attitude and position) and 6D motion (angular velocity and linear velocity) of a known tumbling spacecraft using monocular camera for close-proximity operations. The proposed MEKF estimates pose and 6D motion of the target spacecraft relative to servicing spacecraft by combining dynamic and kinematic information with pseudo-measurement information provided by convolutional neural network (CNN). Specifically, a modified EfficientNet-B3 CNN architecture is constructed for recovering pose directly, while the 2D and 6D motion mapping equation is derived to recover 6D motion from monocular video stream. The former is recovered by using the CNN to learn the implicit mapping relationship between image and pose. The latter is recovered using motion mapping equation which combines pose pseudo-measurements with information of two consecutive image frames. The advantage of the proposed method is that it can provide full relative state pseudo-measurements including pose and 6D motion to the filter merely using a single monocular camera as sensor. Subsequently, the Failed Spacecraft Close-range Observation Dataset (FSCOD) is constructed to analyze the performance of the proposed method. The FSCOD consists of training and validation images independent of dynamic trajectories and test images induced by dynamic trajectories of tumbling spacecraft, both of which are generated by Unreal Engine 4 (UE4). Numerical simulations are conducted utilizing sequential images to evaluate the performance of the proposed MEKF and the modified EfficientNet-B3. The results show that the estimation accuracy of the modified EfficientNet-B3 is approximate to that of the UrsoNet, whereas its network parameters are only half of the UrsoNet. Moreover, the proposed MEKF has higher steady-state pose estimation accuracy than the UrsoNet baseline method and is robust to initial estimation errors and measurement noise.</div></div>","PeriodicalId":50850,"journal":{"name":"Advances in Space Research","volume":"75 3","pages":"Pages 2908-2928"},"PeriodicalIF":2.8000,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in Space Research","FirstCategoryId":"89","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0273117724011232","RegionNum":3,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Citations: 0

Abstract

This paper presents a neural network-based multiplicative extended Kalman filter (MEKF) that tracks the pose (attitude and position) and 6D motion (angular velocity and linear velocity) of a known tumbling spacecraft using a monocular camera, for close-proximity operations. The proposed MEKF estimates the pose and 6D motion of the target spacecraft relative to the servicing spacecraft by combining dynamic and kinematic information with pseudo-measurements provided by a convolutional neural network (CNN). Specifically, a modified EfficientNet-B3 CNN architecture is constructed to recover pose directly, while a mapping equation between 2D image motion and 6D motion is derived to recover the 6D motion from a monocular video stream. The former exploits the CNN to learn the implicit mapping between image and pose; the latter applies the motion mapping equation, which combines the pose pseudo-measurements with information from two consecutive image frames. The advantage of the proposed method is that it provides full relative-state pseudo-measurements, including pose and 6D motion, to the filter using only a single monocular camera as the sensor. Subsequently, the Failed Spacecraft Close-range Observation Dataset (FSCOD) is constructed to analyze the performance of the proposed method. FSCOD consists of training and validation images independent of dynamic trajectories, together with test images induced by dynamic trajectories of a tumbling spacecraft, all generated with Unreal Engine 4 (UE4). Numerical simulations on sequential images evaluate the performance of the proposed MEKF and the modified EfficientNet-B3. The results show that the estimation accuracy of the modified EfficientNet-B3 is comparable to that of UrsoNet, while it uses only half as many network parameters. Moreover, the proposed MEKF achieves higher steady-state pose estimation accuracy than the UrsoNet baseline and is robust to initial estimation errors and measurement noise.
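To make the motion-mapping idea above concrete, the following is a minimal sketch, not the authors' implementation: it recovers a 6D-motion pseudo-measurement by differencing two consecutive CNN pose pseudo-measurements. All names (`motion_pseudo_measurement`, `dt`, the `[x, y, z, w]` quaternion convention) are illustrative assumptions, and the small-angle finite difference stands in for the exact mapping equation derived in the paper.

```python
import numpy as np


def quat_mul(q, r):
    """Hamilton product of quaternions in [x, y, z, w] order."""
    x1, y1, z1, w1 = q
    x2, y2, z2, w2 = r
    return np.array([
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 + y1 * w2 + z1 * x2 - x1 * z2,
        w1 * z2 + z1 * w2 + x1 * y2 - y1 * x2,
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
    ])


def quat_conj(q):
    """Conjugate (= inverse for a unit quaternion)."""
    return np.array([-q[0], -q[1], -q[2], q[3]])


def motion_pseudo_measurement(q_prev, p_prev, q_curr, p_curr, dt):
    """Finite-difference 6D motion from two consecutive pose pseudo-measurements.

    Angular velocity uses the small-rotation approximation
    omega ~= 2 * vec(q_curr * q_prev^-1) / dt, valid when the inter-frame
    rotation is small; linear velocity is a first-order difference of the
    relative positions.
    """
    dq = quat_mul(q_curr, quat_conj(q_prev))
    if dq[3] < 0.0:                 # enforce the shorter rotation
        dq = -dq
    omega = 2.0 * dq[:3] / dt       # rad/s
    v = (p_curr - p_prev) / dt      # m/s
    return omega, v


# Example: two CNN pose outputs 0.1 s apart, ~0.01 rad rotation about z.
q0, p0 = np.array([0.0, 0.0, 0.0, 1.0]), np.array([10.0, 0.0, 0.0])
q1, p1 = np.array([0.0, 0.0, np.sin(0.005), np.cos(0.005)]), np.array([9.98, 0.0, 0.0])
omega, v = motion_pseudo_measurement(q0, p0, q1, p1, dt=0.1)
print(omega)  # ~[0, 0, 0.1] rad/s
print(v)      # ~[-0.2, 0, 0] m/s
```

The stacked pseudo-measurement (pose plus omega and v) would then feed a standard MEKF measurement update, with the attitude handled multiplicatively through an error quaternion rather than added to the state directly.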
Source Journal: Advances in Space Research (Geosciences/Astronomy; Earth Science, comprehensive)
CiteScore: 5.20
Self-citation rate: 11.50%
Annual article count: 800
Review time: 5.8 months
Journal Description: The COSPAR publication Advances in Space Research (ASR) is an open journal covering all areas of space research, including: space studies of the Earth's surface, meteorology, climate, the Earth-Moon system, planets and small bodies of the solar system, upper atmospheres, ionospheres and magnetospheres of the Earth and planets including reference atmospheres, space plasmas in the solar system, astrophysics from space, materials sciences in space, fundamental physics in space, space debris, space weather, Earth observations of space phenomena, etc. NB: Manuscripts related to life sciences in space are no longer accepted for submission to Advances in Space Research; such manuscripts should now be submitted to the new COSPAR journal Life Sciences in Space Research (LSSR). All submissions are reviewed by two scientists in the field. COSPAR is an interdisciplinary scientific organization concerned with the progress of space research on an international scale. Operating under the rules of ICSU, COSPAR ignores political considerations and considers all questions solely from the scientific viewpoint.
Latest Articles from This Journal
- Integration of Jason-3, HY-2 series, and GPS observations for global ionospheric modeling with refined systematic biases
- Fast calculation of geomagnetic cutoff rigidity
- Ionospheric response of intense geomagnetic storms near the peak phase of Solar Cycle 25 at low mid-latitude Indian station, New Delhi
- Determination of reference point for Sheshan SLR telescope using self-driven prism system
- First study of polarization jet/SAID using onboard ionosonde on Ionosfera-M satellite