
Journal of Field Robotics: Latest Articles

Cover Image, Volume 42, Number 8, December 2025
IF 5.2, CAS Tier 2 (Computer Science), Q2 ROBOTICS. Pub Date: 2025-11-17. DOI: 10.1002/rob.70115
Sehwa Chun, Hiroki Yokohata, Kenji Ohkuma, Shouhei Ito, Shinichiro Hirabayashi, Toshihiro Maki

The cover image is based on the article Tracking mooring lines of floating structures by an autonomous underwater vehicle by Sehwa Chun et al., 10.1002/rob.70076.

Citations: 0
Cover Image, Volume 42, Number 7, October 2025
IF 5.2, CAS Tier 2 (Computer Science), Q2 ROBOTICS. Pub Date: 2025-11-06. DOI: 10.1002/rob.70071

The image illustrates a jet-powered personal aerial vehicle (PAV) performing vertical take-off and landing within a post-disaster urban environment. The system integrates five micro turbojet engines and a dual-degree-of-freedom vector control mechanism to achieve high maneuverability, stability, and fault tolerance. This work demonstrates the feasibility of a compact, human-scale VTOL platform capable of safe operation in complex field conditions, with potential applications in rescue and rapid response missions.

Citations: 0
Tracking Mooring Lines of Floating Structures by an Autonomous Underwater Vehicle
IF 5.2, CAS Tier 2 (Computer Science), Q2 ROBOTICS. Pub Date: 2025-09-30. DOI: 10.1002/rob.70076
Sehwa Chun, Hiroki Yokohata, Kenji Ohkuma, Shouhei Ito, Shinichiro Hirabayashi, Toshihiro Maki

This study presents a novel method for tracking mooring lines of Floating Offshore Wind Turbines (FOWTs) using an Autonomous Underwater Vehicle (AUV) equipped with a tilt-controlled Multibeam Imaging Sonar (MBS). The proposed approach enables the AUV to estimate the 3D positions of mooring lines and safely track them in real time, overcoming the limitations of traditional Remotely Operated Vehicle (ROV)-based inspections. By utilizing the tilt-controlled MBS and a pre-trained You Only Look Once (YOLO) model, the AUV identifies the mooring lines within sonar imagery and dynamically adjusts its velocities to maintain a safe distance during the inspection. A re-navigation method using the Rauch-Tung-Striebel (RTS) smoother enhances the AUV's localization accuracy by correcting its trajectory with post-processed data from sensors such as the Doppler Velocity Log (DVL), Super Short Baseline (SSBL) system, and Global Navigation Satellite System (GNSS). Additionally, reconstruction with catenary curve fitting is employed to estimate the mooring line's catenary parameters, offering insights into its potential deformation. The approach was validated using the hovering-type AUV Tri-TON through both tank experiments and a sea experiment at the FOWT Hibiki in Kitakyushu, Japan. In the sea experiment, the AUV successfully tracked the mooring lines for 423 s, demonstrating its ability to estimate the position and catenary parameters of the mooring lines. The experimental results highlight areas for future improvement, particularly in enhancing localization accuracy, developing robust control algorithms, and expanding the analysis of mooring line conditions. This method lays the groundwork for future advancements in automated mooring line inspections and enables the integration of additional techniques, such as visual inspection.

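The catenary-parameter estimation can be illustrated with a minimal sketch: assuming the line's lowest point is known, the remaining catenary parameter a is fit to estimated points by least squares. The grid search and all numbers below are illustrative stand-ins, not the authors' solver:

```python
import math

def catenary_y(x, a, x0, y0):
    # Catenary profile: y = a * cosh((x - x0) / a) + y0
    return a * math.cosh((x - x0) / a) + y0

def fit_catenary_a(points, x0, y0, a_grid):
    # Grid-search the parameter `a` minimizing squared vertical error,
    # with the lowest point (x0, y0) assumed known for simplicity.
    best_a, best_err = None, float("inf")
    for a in a_grid:
        err = sum((catenary_y(x, a, x0, y0) - y) ** 2 for x, y in points)
        if err < best_err:
            best_a, best_err = a, err
    return best_a

# Synthetic line profile generated with a = 25 m, lowest point at the origin
points = [(x, catenary_y(x, 25.0, 0.0, 0.0)) for x in range(-20, 21, 5)]
a_hat = fit_catenary_a(points, 0.0, 0.0, [v / 2 for v in range(20, 81)])
```

In the actual pipeline the input points would come from the sonar-derived 3D reconstruction, and a proper nonlinear solver would fit all catenary parameters jointly.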
Citations: 0
Visual–Acoustic-Based Framework for Online Inspection of Submerged Structures Using Autonomous Underwater Vehicles
IF 5.2, CAS Tier 2 (Computer Science), Q2 ROBOTICS. Pub Date: 2025-09-30. DOI: 10.1002/rob.70075
Simone Tani, Francesco Ruscio, Andrea Caiti, Riccardo Costanzi

Underwater inspections of critical maritime infrastructures are still predominantly performed by human divers, exposing them to safety risks and yielding limited accuracy and repeatability. Autonomous Underwater Vehicles (AUVs) offer a promising alternative by removing humans from hazardous environments and enabling systematic, repeatable inspection operations. However, current AUV systems lack the necessary autonomy and typically rely on prior knowledge of the environment, limiting their applicability in real-world scenarios. This study presents a visual–acoustic-based framework aimed at overcoming these limitations and moving a step closer to fully autonomous inspection operations using AUVs. Designed for cost-effective deployment on vehicles equipped with a minimal sensor suite (a stereo camera, an acoustic range sensor, an Inertial Measurement Unit with magnetometers, a pressure sensor, and a Global Positioning System used only on the surface), the framework enables inspection of unknown underwater structures without human intervention. The main contribution lies in the integration of perception and navigation into a unified architecture, allowing the AUV to leverage the exteroceptive sensor not only for scene understanding but also to support real-time control and mission adaptation. Perception data are combined with proprioceptive observations to adapt motion based on the environment, enabling autonomous management of the inspection mission and navigation with respect to the target. Furthermore, a mission manager coordinates all phases of the operation, from initial approach to structure-relative navigation and visual data acquisition. The proposed solution was validated through a sea trial, during which an AUV autonomously inspected a harbor pier. The framework computed control actions in quasi-real-time to maintain a predefined safety distance, inspection velocity, and payload orientation orthogonal to the scene. These outputs were used online as feedback within the AUV's control loop. The underwater robot completed the inspection, maintaining mission references and ensuring effective target coverage, good-quality optical data, and consistent three-dimensional reconstruction. Overall, this experimental validation demonstrates the feasibility of the proposed framework and marks a significant milestone toward the deployment of fully autonomous AUVs for real-world underwater inspection missions, even in the absence of prior knowledge about the structure.
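The structure-relative control described above can be sketched as a proportional standoff regulator on the measured range; the gain, saturation limit, and function name are hypothetical, not taken from the paper:

```python
def standoff_command(measured_range, target_range, kp=0.8, v_max=0.3):
    # Proportional surge-velocity command (m/s): positive moves the AUV
    # closer to the structure, negative backs it away; saturated at v_max.
    v = kp * (measured_range - target_range)
    return max(-v_max, min(v_max, v))
```

A real controller would also regulate inspection velocity along the structure and payload orientation, and filter the acoustic range before feeding it back.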

Citations: 0
Friction Shock Absorbers and Reverse Thrust for Fast Multirotor Landing on High-Speed Vehicles
IF 5.2, CAS Tier 2 (Computer Science), Q2 ROBOTICS. Pub Date: 2025-09-24. DOI: 10.1002/rob.70069
Isaac Tunney, John Bass, Alexis Lussier Desbiens

Typical landing gears of small uninhabited aerial vehicles (UAVs) limit their capability to land on vehicles moving at more than 20–50 km/h due to high drag forces, high pitch angles and potentially high relative horizontal velocities. To enable landing at higher speeds, a combination of lightweight friction shock absorbers and reverse thrust was developed. This allows for rapid descents (i.e., 3 m/s) toward the vehicle while leveling at the last instant. Simulations show that the proposed system (1) is more robust at higher descent speeds than traditional configurations, (2) can touch down at almost any time during the leveling maneuver, thus reducing the timing constraints, and (3) is robust to many environmental, design and operational factors, maintaining a success rate above 80% up to 100 km/h. Compared to standard multirotors, this approach expands the possible state envelope at touchdown by a factor of 60. A total of 38 experimental trials were conducted in which a drone successfully landed on a pickup truck moving at speeds ranging from 10 to 110 km/h. The increased touchdown envelope was shown to improve the multirotor's robustness to external disturbances such as winds and wind gusts, sensor errors and unpredictable motion of the ground vehicle. The increased landing capabilities also expand the flight envelope at the start of the leveling maneuver by a factor of 38 compared to a standard multirotor, thereby allowing the drone to fly in tougher conditions and initiate its leveling maneuver from a broader range of altitudes, vertical and horizontal velocities, as well as pitch angles and rates.
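A back-of-envelope calculation shows why reverse thrust makes fast descents viable: the height needed to arrest a constant descent rate follows from v² = 2ad. The braking deceleration below is a hypothetical value for illustration, not a figure from the paper:

```python
def braking_height(v_descent, a_brake):
    # Height (m) needed to arrest a descent of v_descent (m/s) under a
    # constant net upward deceleration a_brake (m/s^2): d = v^2 / (2 * a)
    return v_descent ** 2 / (2.0 * a_brake)

# A 3 m/s descent (the rate quoted above), arrested with an assumed
# 9 m/s^2 of net braking, needs only half a meter of height.
h = braking_height(3.0, 9.0)
```

The friction shock absorbers relax this budget further by tolerating a nonzero vertical velocity at contact.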

Citations: 0
Development of a Machine Vision-Based Walk-Behind Cotton Fertilizer Applicator
IF 5.2, CAS Tier 2 (Computer Science), Q2 ROBOTICS. Pub Date: 2025-09-21. DOI: 10.1002/rob.70088
Arjun Chouriya, Peeyush Soni, Ajay Kumar Patel, Vijay Mahore

A vision-based, walk-behind fertilizer applicator for cotton crops was designed to address the limitations of conventional systems. Traditional applicators face multiple operational issues such as high discharge rates, uneven fertilizer distribution, input wastage, and increased weed infestation. Field evaluations were carried out to measure parameters including missing plant index, uniformity of application, precision, fertilizer savings, and overall field performance. The applicator integrates three main systems: a cotton detection unit, an electronic controller, and an automatic fertilizing mechanism. The cotton detection unit captures and processes images to identify cotton plants while excluding weeds; the control system governs the stepper motor of the fertilization unit based on cotton detection signals; and the fertilizing mechanism dispenses micro-granular fertilizer near cotton plants by activating a metering unit through the stepper motor. The detection unit showed high accuracy, with bounding box loss at 2.14% and object loss at 1.56% over 100 training epochs, confirming effective real-time detection. Small variations in discharge were observed, with Urea ranging between 5.59% and 8.9% and DAP between 6.89% and 9.25%. The left-to-right distribution ratios for Urea and DAP ranged from 0.89 to 1.05 and 0.90 to 1.04, respectively, with average values of 0.95 (Urea) and 0.97 (DAP). The system demonstrated high precision, delivering fertilizer with an accuracy between 90% and 93% and an overall average of 91%. Theoretical and actual field capacities were in the range of 0.07–0.14 and 0.045–0.078 ha/h, respectively, while field efficiency (FE) varied from 69.39% at 0.5 km/h to 58.21% at 1 km/h. This study effectively developed a precision applicator that ensures targeted fertilization, adaptive delivery, reduced labor and time, and better weed control, and promotes the use of modern technology in farming.
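The reported field metrics follow standard agricultural-machinery definitions; a minimal sketch with invented inputs (the function names are mine, not from the paper):

```python
def field_efficiency(actual_capacity, theoretical_capacity):
    # Field efficiency (%) = actual field capacity / theoretical field
    # capacity * 100, both in ha/h.
    return 100.0 * actual_capacity / theoretical_capacity

def lr_distribution_ratio(left_discharge, right_discharge):
    # Left-to-right distribution ratio; 1.0 means a perfectly even spread.
    return left_discharge / right_discharge

fe = field_efficiency(0.07, 0.1)          # illustrative capacities, ha/h
ratio = lr_distribution_ratio(0.95, 1.0)  # illustrative discharge masses
```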

Citations: 0
A Robust Transformer–Based Error Compensation Method for Gyroscope of IMUs
IF 5.2, CAS Tier 2 (Computer Science), Q2 ROBOTICS. Pub Date: 2025-09-18. DOI: 10.1002/rob.70082
Xin Ye, Weijia Xing, Zhilou Yu, Jichen Chen, Qingyue Ma

Inertial Measurement Units (IMUs), comprising gyroscopes and accelerometers, are fundamental for motion estimation in navigation and robotics. However, their performance is often degraded by nonlinear and time-varying errors, such as bias drift, scale-factor deviations, and sensor noise. Traditional compensation methods based on linear assumptions or static models struggle to address the dynamic and correlated nature of these errors, limiting real-time calibration robustness. To address this challenge, we propose a transformer-based framework for gyroscope error compensation, which dynamically models temporal dependencies and nonlinear error characteristics using self-attention mechanisms. Our approach incorporates a sliding window to exploit historical sensor data and applies a geometrically constrained loss function defined on the SO(3) Lie group, ensuring physically consistent orientation estimates. Comprehensive experiments on the public EuRoC and Technical University of Munich data sets demonstrate that our method achieves a mean orientation error of 1.56°, outperforming state-of-the-art approaches including robust 3-D orientation estimation with a single particular IMU (9.46°), DenoiseIMU (2.09°), and Gyro-Net (1.49°). Additionally, our framework reduces the Absolute Trajectory Error (ATE) by 45.6% (average 0.070 m) and the Relative Pose Error by 48.2% (average 0.0043 m) compared with established baselines. These results highlight the effectiveness and robustness of our method, particularly in challenging scenarios with rapid motion and low-texture environments. Overall, our transformer-based approach significantly enhances the reliability and accuracy of IMU-based systems, offering a promising solution for autonomous navigation and related applications.
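A loss "defined on the SO(3) Lie group" usually means penalizing the geodesic angle between predicted and reference rotations rather than raw matrix entries. A minimal sketch of that distance follows; the paper's exact loss may differ:

```python
import math

def geodesic_angle(R_pred, R_true):
    # trace(R_pred^T @ R_true) equals the elementwise product sum of the
    # two 3x3 matrices (given here as nested lists).
    tr = sum(R_pred[i][j] * R_true[i][j] for i in range(3) for j in range(3))
    # Geodesic distance on SO(3): arccos((trace - 1) / 2), clamped for
    # numerical safety before acos.
    return math.acos(max(-1.0, min(1.0, (tr - 1.0) / 2.0)))

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
Rz90 = [[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]  # 90° about z
```

Averaging this angle over a training batch gives a differentiable, rotation-aware objective, which is why it is preferred over a plain elementwise matrix error.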

Citations: 0
Motion Path Analysis for Differential Drive Robots Using Single Video Camera and Perspective Transformations
IF 5.2, CAS Tier 2 (Computer Science), Q2 ROBOTICS. Pub Date: 2025-09-03. DOI: 10.1002/rob.70065
Mohd Fazril Izhar Mohd Idris, Ahmad Ramli, Wan Zafira Ezza Wan Zakaria

This study presents a novel method for analyzing the motion path of a differential drive mobile robot using a single video camera and perspective transformation. The robot utilized is the Zoom:Bit by Cytron Technologies, with motion planning based on clothoids, specifically the C1-Continuous Double Clothoid Segments introduced by Idris et al. The robot's movement is generated by calculating wheel speeds along the clothoid path, with the technical configuration also discussed in detail. In contrast to conventional methods that depend on automatic tracking systems such as GPS and sensor fusion, this approach adopts a cost-effective setup using a single camera mounted on a tripod to capture the robot's motion. The recorded video is processed using Mathematica, where keyframes are extracted, and the robot's coordinates are automatically identified at specific time intervals. Perspective transformation is then employed to convert the recorded 3D motion into a 2D plane, enabling a detailed comparison between the robot's actual trajectories and simulated results. The findings underscore the feasibility and accuracy of using a single-camera system for 3D motion capture and validate the efficiency of C1-Continuous Double Clothoid Segments for motion planning, offering a practical and cost-efficient solution for motion path analysis in mobile robotics.
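The perspective transformation step maps image pixels to ground-plane coordinates through a 3x3 homography. The sketch below only applies a homography; in practice the matrix would first be estimated from four or more known ground reference points (e.g., with OpenCV's getPerspectiveTransform), and the pure-scaling matrix here is an invented example:

```python
def apply_homography(H, u, v):
    # Map an image pixel (u, v) to ground-plane coordinates (x, y) using a
    # 3x3 homography H: homogeneous transform, then divide by w.
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    x = (H[0][0] * u + H[0][1] * v + H[0][2]) / w
    y = (H[1][0] * u + H[1][1] * v + H[1][2]) / w
    return x, y

# Invented pure-scaling homography: 100 px per meter, no perspective terms.
H_scale = [[0.01, 0.0, 0.0], [0.0, 0.01, 0.0], [0.0, 0.0, 1.0]]
x, y = apply_homography(H_scale, 300, 400)
```

For an oblique tripod-mounted camera the bottom row of H is nonzero, which is what removes the perspective foreshortening when the robot's trajectory is projected onto the 2D plane.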

Citations: 0
A Reliable Omnidirectional Acoustic Guidance Method for AUV Recovery and Homing
IF 5.2 CAS Tier 2 Computer Science Q2 ROBOTICS Pub Date: 2025-09-02 DOI: 10.1002/rob.70066
Li Shuchang, Jiang Yanqing, Wang Jialin, Gao Rui, Gao Jingxuan, Zhang Yiyang, Li Yueming, Li Ye

A single-beacon, ranging-based acoustic guidance method is proposed to ensure reliable, rapid, and omnidirectional homing guidance for the recovery of autonomous underwater vehicles (AUVs). To address the navigation system deviations caused by prolonged underwater operations of both the AUV and the mothership, the method first unifies their navigation datums. It then uses limited moving-target information and ranging data to estimate the position of the moving target, enabling effective AUV recovery and homing guidance. This method is theoretically applicable across a wide range of distances, from tens of meters to several kilometers, and is well suited to autonomous homing. It was adopted for its ability to overcome the key limitations of traditional ultrashort baseline systems, which are constrained by non-omnidirectional localization and susceptibility to positional errors arising from relative positioning between the AUV and the mothership. Results from trials at the Danjiangkou Reservoir demonstrate, for the first time, that an AUV can be accurately guided to the vicinity of the mothership using measurement data from a single acoustic communication device. To evaluate the adaptability of the proposed acoustic guidance algorithm under large initial position errors, 14 field experiments were conducted with the mothership operating under both dynamic positioning (DP) and moving conditions. The results showed that homing deviation was consistently maintained within 8 m, with best-case values of 2.6 m (DP) and 3.5 m (moving), and corresponding average deviations of 5.8 and 6.5 m.
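The core of single-beacon, range-only guidance can be sketched as follows: if the AUV logs its own navigation fixes together with the distances measured to the mothership's transducer, subtracting one range equation from the others linearizes the geometry, and a least-squares solve recovers the target position. The function name and the static-target, noise-free simplification are illustrative assumptions; the paper's estimator additionally handles a moving target and datum unification:

```python
import numpy as np

def locate_beacon(positions, ranges):
    """Least-squares estimate of a static beacon from ranges at known positions.

    positions: (N, 2) array of vehicle navigation fixes.
    ranges:    (N,)  measured distances to the beacon.
    Subtracting the first range equation from the rest removes the quadratic
    |x|^2 term, leaving the linear system 2(p_i - p_0) . x = b_i.
    """
    p = np.asarray(positions, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (p[1:] - p[0])
    b = (r[0] ** 2 - r[1:] ** 2) + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
    est, *_ = np.linalg.lstsq(A, b, rcond=None)
    return est
```

At least three non-collinear fixes are needed for a unique 2D solution; in practice the AUV's transit naturally provides the geometric diversity.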

Journal of Field Robotics, 43(2), 816–834.
Citations: 0
Development of a Grasshopper-Leg-Inspired Back-Type Exoskeleton for the Reduction of Muscle Activation During Stoop Activities
IF 5.2 CAS Tier 2 Computer Science Q2 ROBOTICS Pub Date: 2025-09-01 DOI: 10.1002/rob.70062
Dang Khanh Linh Le, Wei Chih Lin

Farmers frequently suffer from musculoskeletal disorders, particularly lower back pain (LBP), mainly due to occupational ergonomic factors such as repetitive stooping and lifting. This study proposes using an exoskeleton to reduce LBP risk by providing external torque to the hip joints. The grasshopper-leg-inspired exoskeleton (GIExo) incorporates a spring actuator mimicking the lump mechanism of grasshopper legs and an actively controlled clutch to toggle assistance. Fifteen farmers (n = 15) participated in rice field experiments under three conditions: without assistance, wearing GIExo without activation, and wearing it with assistance. Electromyography (EMG) was used to monitor muscle activation while farmers performed lifting and rolling tasks. The AnyBody modeling software estimated compression and impact forces on the body before and after wearing GIExo. A body-tracking experiment using YOLOv8 was conducted to evaluate the influence of GIExo on the natural movement of users. Results from the Friedman test showed that GIExo reduced average activation of the upper thoracic erector spinae (TES) and lumbar erector spinae (LES) by 28.5%–35.5% and 31.9%–35.2%, respectively, during lifting tasks, and by 52.1%–45.4% and 64.6%–68.2%, respectively, during rolling tasks. The compression force between the 5th lumbar vertebra (L5) and sacrum (S1) decreased by 7.7% with GIExo. Furthermore, when the exoskeleton was worn in the without-activation state, it did not restrict the user's range of motion. Additionally, perceived exertion and acceptance surveys evaluated user feedback on physical effort, comfort, and usability.
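The EMG comparison above reduces to two computations: the percentage drop in mean activation between conditions, and a Friedman test across the three repeated-measures conditions. The sketch below uses illustrative function names and assumes no tied EMG values within a subject (reasonable for continuous amplitudes); it is not the authors' analysis pipeline:

```python
import numpy as np

def percent_reduction(baseline, assisted):
    """Mean percentage reduction in muscle activation relative to baseline."""
    baseline = np.asarray(baseline, dtype=float)
    assisted = np.asarray(assisted, dtype=float)
    return 100.0 * (baseline.mean() - assisted.mean()) / baseline.mean()

def friedman_statistic(data):
    """Friedman chi-square for an (n_subjects, k_conditions) table.

    Values are ranked within each subject (1 = smallest). Assumes no ties,
    which holds for continuous EMG amplitudes. Degrees of freedom: k - 1.
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    ranks = data.argsort(axis=1).argsort(axis=1) + 1.0  # per-subject ranks
    col_sums = ranks.sum(axis=0)
    return 12.0 / (n * k * (k + 1)) * np.sum(col_sums ** 2) - 3.0 * n * (k + 1)
```

With fifteen subjects and three conditions, the statistic is compared against a chi-square distribution with 2 degrees of freedom to decide whether the assistance condition significantly changed activation.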

Journal of Field Robotics, 43(2), 739–760.
Citations: 0