Title: A novel method for measuring center-axis velocity of unmanned aerial vehicles through synthetic motion blur images
Authors: Quanxi Zhan, Yanmin Zhou, Junrui Zhang, Chenyang Sun, Runjie Shen, Bin He
Journal: Autonomous Intelligent Systems (自主智能系统), Vol. 4, No. 1
Published: 2024-07-09 (Journal Article)
DOI: 10.1007/s43684-024-00073-x
Open-access PDF: https://link.springer.com/content/pdf/10.1007/s43684-024-00073-x.pdf
Article page: https://link.springer.com/article/10.1007/s43684-024-00073-x
Citations: 0
Abstract
Accurate velocity measurement of unmanned aerial vehicles (UAVs) is essential in various applications. Traditional vision-based methods rely heavily on visual features, which are often inadequate in low-light or feature-sparse environments. This study presents a novel approach to measuring the axial velocity of UAVs using motion blur images captured by a UAV-mounted monocular camera. We introduce a motion blur model that synthesizes images from neighboring frames to enhance motion blur visibility. The synthesized blur frames are transformed into spectrograms using the Fast Fourier Transform (FFT). We then apply a binarization process and the Radon transform to extract the light-dark stripe spacing, which represents the motion blur length. This length is used to establish a model correlating motion blur with axial velocity, enabling precise velocity calculation. Field tests in a hydropower station penstock demonstrated an average velocity error of 0.048 m/s compared to ultra-wideband (UWB) measurements. The root-mean-square error was 0.025, with an average computational time of 42.3 ms and a CPU load of 17%. These results confirm the stability and accuracy of our velocity estimation algorithm in challenging environments.
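The frequency-domain idea behind the pipeline can be sketched in a few lines: a motion blur of length L pixels acts as a box point-spread function whose transfer function has zeros spaced N/L apart, so the dark stripes in the magnitude spectrum encode the blur length. The snippet below is a simplified 1-D illustration of that relationship, not the authors' full method (which synthesizes blur from neighboring frames and uses 2-D spectrograms with binarization and the Radon transform); the function names and the notch-detection threshold are illustrative assumptions.

```python
import numpy as np

def synthesize_motion_blur(image, blur_len):
    """Apply a horizontal box-blur PSF of `blur_len` pixels (circular convolution)."""
    n = image.shape[1]
    kernel = np.zeros(n)
    kernel[:blur_len] = 1.0 / blur_len
    H = np.fft.fft(kernel)  # transfer function of the blur
    return np.real(np.fft.ifft(np.fft.fft(image, axis=1) * H, axis=1))

def estimate_blur_length(blurred):
    """A box PSF of length L zeroes the spectrum at multiples of N/L.
    Locate the first dark stripe (notch) in the row-averaged magnitude
    spectrum and invert that spacing to recover L in pixels."""
    n = blurred.shape[1]
    profile = np.abs(np.fft.fft(blurred, axis=1)).mean(axis=0)
    thresh = 1e-6 * profile.max()  # illustrative threshold for a deep notch
    k_star = next(k for k in range(1, n // 2) if profile[k] < thresh)
    return n / k_star
```

In the paper's setting, the recovered blur length is then mapped to axial velocity through a calibrated camera and exposure model; that calibration is described in the full text, not here.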