{"title":"基于预测模型的磁传感器阵列摇摆误差校正","authors":"Zhiyu Lu;Li Yang;Bin Wang;Kun Wu;Yongxin Li;Xiaoping Zheng","doi":"10.1109/TGRS.2024.3494868","DOIUrl":null,"url":null,"abstract":"Magnetic sensor arrays are typically used to detect magnetic targets. Currently, research on sensor array calibration focuses on solving the problems of inconsistent sensitivity, zero offset, and non-orthogonality in individual sensors, and misalignment errors between sensors. However, in magnetic field detection, sensor arrays are usually mounted on a platform or carried as a handheld device and are prone to random swaying during the detection process, leading to changes in the attitude and position of the magnetic sensors, which in turn generates swaying errors. In this study, the source of swaying error in sensor arrays was first analyzed theoretically, and a swaying error model of a magnetic sensor was established. Second, a swaying error calibration method was proposed in combination with the classical prediction model—Gaussian process regression (GPR), backpropagation (BP) neural network, and support vector machine (SVM) in the field of artificial intelligence. The experimental results show that the prediction performance based on the BP neural network is the most outstanding. After correction, the relative error percentage of the swaying error in the magnetic field data decreased significantly from 165.50% to 9.35%, which is a significant improvement of the correction effect. In addition, we conducted model comparison experiments in different environments, and the results show that the BP model performs well in various environments, demonstrating its strong generalization ability and robustness. Finally, the distance error of the magnetic dipole position was significantly reduced after calibration, from 1.56, 1.12, and 2.50 m to 0.17, 0.04, and 0.04 m, respectively. Thus, the effectiveness of the calibration method was verified.","PeriodicalId":13213,"journal":{"name":"IEEE Transactions on Geoscience and Remote Sensing","volume":"62 ","pages":"1-11"},"PeriodicalIF":7.5000,"publicationDate":"2024-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Predictive Model-Based Correction of Magnetic Sensor Array Sway Errors\",\"authors\":\"Zhiyu Lu;Li Yang;Bin Wang;Kun Wu;Yongxin Li;Xiaoping Zheng\",\"doi\":\"10.1109/TGRS.2024.3494868\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Magnetic sensor arrays are typically used to detect magnetic targets. Currently, research on sensor array calibration focuses on solving the problems of inconsistent sensitivity, zero offset, and non-orthogonality in individual sensors, and misalignment errors between sensors. However, in magnetic field detection, sensor arrays are usually mounted on a platform or carried as a handheld device and are prone to random swaying during the detection process, leading to changes in the attitude and position of the magnetic sensors, which in turn generates swaying errors. In this study, the source of swaying error in sensor arrays was first analyzed theoretically, and a swaying error model of a magnetic sensor was established. Second, a swaying error calibration method was proposed in combination with the classical prediction model—Gaussian process regression (GPR), backpropagation (BP) neural network, and support vector machine (SVM) in the field of artificial intelligence. 
The experimental results show that the prediction performance based on the BP neural network is the most outstanding. After correction, the relative error percentage of the swaying error in the magnetic field data decreased significantly from 165.50% to 9.35%, which is a significant improvement of the correction effect. In addition, we conducted model comparison experiments in different environments, and the results show that the BP model performs well in various environments, demonstrating its strong generalization ability and robustness. Finally, the distance error of the magnetic dipole position was significantly reduced after calibration, from 1.56, 1.12, and 2.50 m to 0.17, 0.04, and 0.04 m, respectively. Thus, the effectiveness of the calibration method was verified.\",\"PeriodicalId\":13213,\"journal\":{\"name\":\"IEEE Transactions on Geoscience and Remote Sensing\",\"volume\":\"62 \",\"pages\":\"1-11\"},\"PeriodicalIF\":7.5000,\"publicationDate\":\"2024-11-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Geoscience and Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10750019/\",\"RegionNum\":1,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Geoscience and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10750019/","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
Magnetic sensor arrays are typically used to detect magnetic targets. Current research on sensor array calibration focuses on correcting inconsistent sensitivity, zero offset, and non-orthogonality within individual sensors, as well as misalignment errors between sensors. In magnetic field detection, however, sensor arrays are usually mounted on a platform or carried as handheld devices and are prone to random swaying during detection; the swaying changes the attitude and position of the magnetic sensors, which in turn produces swaying errors. In this study, the source of swaying error in sensor arrays was first analyzed theoretically, and a swaying error model of a magnetic sensor was established. A swaying error calibration method was then proposed using three classical prediction models from the field of artificial intelligence: Gaussian process regression (GPR), the backpropagation (BP) neural network, and the support vector machine (SVM). The experimental results show that the BP neural network delivers the best prediction performance. After correction, the relative error percentage of the swaying error in the magnetic field data decreased from 165.50% to 9.35%, a substantial improvement. Model comparison experiments conducted in different environments further show that the BP model performs well across all of them, demonstrating strong generalization ability and robustness. Finally, after calibration, the distance error of the estimated magnetic dipole position was reduced from 1.56, 1.12, and 2.50 m to 0.17, 0.04, and 0.04 m, respectively, verifying the effectiveness of the calibration method.
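
To make the correction pipeline described above concrete, the following is a minimal sketch (not the authors' code) of a BP-style sway-error correction: a small network predicts the sway-induced field error from attitude and the prediction is subtracted from the measurement. It uses scikit-learn's MLPRegressor as a stand-in for the paper's backpropagation neural network; the attitude features, the synthetic data, and the relative-error-percentage definition are illustrative assumptions only.

# Minimal sketch, assuming attitude angles are the model input and the sway error
# is the deviation of the measured field from a static reference. Not the authors'
# implementation; all data below are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic sway attitude (roll, pitch, yaw in degrees) and the resulting
# three-axis magnetic field error relative to a static reference measurement (nT).
attitude = rng.uniform(-10.0, 10.0, size=(2000, 3))
true_error = 50.0 * np.sin(np.deg2rad(attitude)) + rng.normal(0.0, 1.0, (2000, 3))
reference_field = np.array([20000.0, 1000.0, 45000.0])   # assumed ambient field, nT
measured_field = reference_field + true_error

# Train a small BP-style network to predict the sway-induced error from attitude.
scaler = StandardScaler().fit(attitude[:1500])
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(scaler.transform(attitude[:1500]), true_error[:1500])

# Correct the held-out measurements by subtracting the predicted sway error.
predicted_error = model.predict(scaler.transform(attitude[1500:]))
corrected_field = measured_field[1500:] - predicted_error

# One plausible relative-error-percentage metric (the paper's exact definition
# may differ): mean of |B - B_ref| / |B_ref| * 100 over axes and samples.
def relative_error_pct(field):
    return 100.0 * np.mean(np.abs(field - reference_field) / np.abs(reference_field))

print("before correction: %.2f%%" % relative_error_pct(measured_field[1500:]))
print("after correction:  %.2f%%" % relative_error_pct(corrected_field))

In practice the training targets would come from a controlled sway experiment against a fixed reference sensor, and GPR or SVM regressors could be swapped in for the same comparison the abstract reports.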
About the Journal:
IEEE Transactions on Geoscience and Remote Sensing (TGRS) is a monthly publication that focuses on the theory, concepts, and techniques of science and engineering as applied to sensing the land, oceans, atmosphere, and space; and the processing, interpretation, and dissemination of this information.