{"title":"Deep Learning Based Visual Servo for Autonomous Aircraft Refueling","authors":"Natthaphop Phatthamolrat, Teerawat Tongloy, Siridech Boonsang, Santhad Chuwongin","doi":"10.1002/eng2.70055","DOIUrl":null,"url":null,"abstract":"<p>This study develops and evaluates a deep learning based visual servoing (DLBVS) control system for guiding industrial robots during aircraft refueling, aiming to enhance operational efficiency and precision. The system employs a monocular camera mounted on the robot's end effector to capture images of target objects—the refueling nozzle and bottom loading adapter—eliminating the need for prior calibration and simplifying real-world implementation. Using deep learning, the system identifies feature points on these objects to estimate their pose estimation, providing essential data for precise manipulation. The proposed method integrates two-stage neural networks with the Efficient Perspective-n-Point (EPnP) principle to determine the orientation and rotation angles, while an approximation principle based on feature point errors calculates linear positions. The DLBVS system effectively commands the robot arm to approach and interact with the targets, demonstrating reliable performance even under positional deviations. Quantitative results show translational errors below 0.5 mm and rotational errors under 1.5° for both the nozzle and adapter, showcasing the system's capability for intricate refueling operations. This work contributes a practical, calibration-free solution for enhancing automation in aerospace applications. The videos and data sets from the research are publicly accessible at https://tinyurl.com/CiRAxDLBVS.</p>","PeriodicalId":72922,"journal":{"name":"Engineering reports : open access","volume":"7 3","pages":""},"PeriodicalIF":1.8000,"publicationDate":"2025-03-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/eng2.70055","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Engineering reports : open access","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/eng2.70055","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Abstract
This study develops and evaluates a deep learning-based visual servoing (DLBVS) control system for guiding industrial robots during aircraft refueling, aiming to enhance operational efficiency and precision. The system employs a monocular camera mounted on the robot's end effector to capture images of the target objects (the refueling nozzle and bottom-loading adapter), eliminating the need for prior calibration and simplifying real-world implementation. Using deep learning, the system identifies feature points on these objects to estimate their pose, providing essential data for precise manipulation. The proposed method integrates two-stage neural networks with the Efficient Perspective-n-Point (EPnP) principle to determine orientation and rotation angles, while an approximation principle based on feature point errors calculates linear positions. The DLBVS system effectively commands the robot arm to approach and interact with the targets, demonstrating reliable performance even under positional deviations. Quantitative results show translational errors below 0.5 mm and rotational errors under 1.5° for both the nozzle and the adapter, demonstrating the system's capability for intricate refueling operations. This work contributes a practical, calibration-free solution for enhancing automation in aerospace applications. The videos and data sets from the research are publicly accessible at https://tinyurl.com/CiRAxDLBVS.
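To make the EPnP step described above concrete, the sketch below shows how detected 2D feature points can be combined with their known 3D locations on the target to recover orientation and translation, using OpenCV's solvePnP with the EPnP solver. The model geometry, image coordinates, and camera intrinsics here are illustrative assumptions and are not taken from the paper; the paper's two-stage feature-point network and its feature-point-error approximation for linear position are not reproduced.

```python
# Minimal sketch of an EPnP pose-estimation step, assuming a set of
# neural-network-detected 2D feature points and a known 3D model of the
# bottom-loading adapter. All numeric values are hypothetical.
import numpy as np
import cv2

# Hypothetical 3D feature-point locations on the adapter (object frame, mm).
object_points = np.array([
    [0.0,  0.0,  0.0],
    [50.0, 0.0,  0.0],
    [50.0, 50.0, 0.0],
    [0.0,  50.0, 0.0],
    [25.0, 25.0, 10.0],
], dtype=np.float64)

# Corresponding 2D feature points (pixels), as a detector might return them.
image_points = np.array([
    [320.0, 240.0],
    [400.0, 238.0],
    [402.0, 318.0],
    [318.0, 320.0],
    [361.0, 275.0],
], dtype=np.float64)

# Nominal pinhole intrinsics; since the system avoids explicit calibration,
# treat these as a rough assumption for the sketch only.
camera_matrix = np.array([
    [800.0, 0.0,   320.0],
    [0.0,   800.0, 240.0],
    [0.0,   0.0,   1.0],
], dtype=np.float64)
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

# EPnP recovers the object pose (rotation + translation) relative to the camera.
ok, rvec, tvec = cv2.solvePnP(
    object_points, image_points, camera_matrix, dist_coeffs,
    flags=cv2.SOLVEPNP_EPNP,
)

if ok:
    rotation_matrix, _ = cv2.Rodrigues(rvec)  # axis-angle vector -> 3x3 matrix
    print("Rotation matrix:\n", rotation_matrix)
    print("Translation (mm, camera frame):", tvec.ravel())
```

In a visual servoing loop, the recovered rotation and translation would feed the robot controller at each frame, driving the end effector until the feature-point error falls below a tolerance; that control law is outside the scope of this sketch.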