Aerospace Systems, vol. 7, no. 2, pp. 395–418 (published 2024-03-30)
DOI: 10.1007/s42401-024-00282-5 (https://link.springer.com/article/10.1007/s42401-024-00282-5)
Author: B. Anbarasu
Vision-based heading estimation for navigation of a micro-aerial vehicle in GNSS-denied staircase environment using vanishing point
Micro-aerial vehicles (MAVs) find it extremely difficult to navigate in GNSS-denied indoor staircase environments, where Global Navigation Satellite System (GNSS) signals are obstructed. To avoid hitting both static and moving obstacles, the MAV must estimate its position and heading in indoor staircase scenes. To detect vanishing points and estimate heading for MAV navigation in a staircase environment, five different input colour-space image frames have been used in this work: the grayscale image converted from the RGB image, the hyper-opponent colour channels O1, O2, and O3 converted from the RGB image, and the Sobel-filtered R channel image frames. To determine the position and direction of the MAV, the Hough transform technique and the K-means clustering algorithm have been combined for line and vanishing point recognition in the staircase image frames. The position of the vanishing point detected in the staircase image frames indicates the position of the MAV (centre, left, or right) in the staircase. In addition, to compute the heading of the MAV, the Euclidean distances between the staircase image centre, the mid-pixel coordinates of the image's last row, and the detected vanishing point pixel coordinates in the succeeding staircase image frames are used. The position and heading measurements can be used to send the MAV a suitable control signal and re-align it at the centre of the staircase when it deviates from the centre. The integrated Hough transform and K-means clustering-based vanishing point detection method is suitable for real-time MAV heading measurement using the O2 channel staircase image frames, achieving a high accuracy of ±0.15° compared with the ±1.5° heading accuracy of the state-of-the-art grid-based vanishing point detection method.
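The pipeline the abstract describes — intersecting detected staircase lines, clustering the intersections with K-means to locate the vanishing point, then measuring the heading angle from the mid-pixel of the image's last row to that point — can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the Hough transform has already produced line segments as point pairs, and all function names are illustrative.

```python
import math
from itertools import combinations

def intersect(l1, l2):
    """Intersection of two infinite lines, each given by two (x, y) points."""
    (x1, y1), (x2, y2) = l1
    (x3, y3), (x4, y4) = l2
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(d) < 1e-9:          # parallel lines: no intersection
        return None
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / d,
            (a * (y3 - y4) - (y1 - y2) * b) / d)

def vanishing_point(lines, k=2, iters=20):
    """Cluster pairwise line intersections with a small K-means and return
    the centroid of the largest cluster as the vanishing point estimate."""
    pts = [p for a, b in combinations(lines, 2)
           if (p := intersect(a, b)) is not None]
    if not pts:
        return None
    k = max(1, min(k, len(pts)))          # cannot have more clusters than points
    centroids = pts[:k]                   # simple initialisation for the sketch
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pts:                     # assign each point to nearest centroid
            j = min(range(k),
                    key=lambda i: (p[0] - centroids[i][0]) ** 2 +
                                  (p[1] - centroids[i][1]) ** 2)
            clusters[j].append(p)
        centroids = [(sum(x for x, _ in c) / len(c),
                      sum(y for _, y in c) / len(c)) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    best = max(range(k), key=lambda i: len(clusters[i]))
    return centroids[best]

def heading_deg(vp, width, height):
    """Heading angle (degrees) from the mid-pixel of the image's last row to
    the vanishing point; 0 means the MAV is aligned with the staircase axis."""
    bx, by = width / 2.0, height - 1.0
    return math.degrees(math.atan2(vp[0] - bx, by - vp[1]))
```

With lines converging at the image centre-top, the estimated vanishing point sits at that convergence and the heading is zero; a vanishing point displaced to the right yields a positive heading, which a controller could use to steer the MAV back to the staircase centre.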
Journal introduction:
Aerospace Systems provides an international, peer-reviewed forum focused on system-level research and development in aeronautics and astronautics. The journal emphasizes the unique role and increasing importance of informatics in aerospace. It fills a gap in current publishing coverage, from outer-space vehicles to atmospheric vehicles, by highlighting interdisciplinary science, technology, and engineering.
Potential topics include, but are not limited to:
Trans-space vehicle systems design and integration
Air vehicle systems
Space vehicle systems
Near-space vehicle systems
Aerospace robotics and unmanned systems
Communication, navigation and surveillance
Aerodynamics and aircraft design
Dynamics and control
Aerospace propulsion
Avionics systems
Opto-electronic systems
Air traffic management
Earth observation
Deep space exploration
Bionic micro-aircraft/spacecraft
Intelligent sensing and information fusion