Kaiwen Wang, Lammert Kooistra, Yaowu Wang, Sergio Vélez, Wensheng Wang, João Valente
{"title":"葡萄园中基于单目摄像头无人机的定位和绘图方法基准测试","authors":"Kaiwen Wang , Lammert Kooistra , Yaowu Wang , Sergio Vélez , Wensheng Wang , João Valente","doi":"10.1016/j.compag.2024.109661","DOIUrl":null,"url":null,"abstract":"<div><div>UAVs equipped with various sensors offer a promising approach for enhancing orchard management efficiency. Up-close sensing enables precise crop localization and mapping, providing valuable a priori information for informed decision-making. Current research on localization and mapping methods can be broadly classified into SfM, traditional feature-based SLAM, and deep learning-integrated SLAM. While previous studies have evaluated these methods on public datasets, real-world agricultural environments, particularly vineyards, present unique challenges due to their complexity, dynamism, and unstructured nature.</div><div>To bridge this gap, we conducted a comprehensive study in vineyards, collecting data under diverse conditions (flight modes, illumination conditions, and shooting angles) using a UAV equipped with high-resolution camera. To assess the performance of different methods, we proposed five evaluation metrics: efficiency, point cloud completeness, localization accuracy, parameter sensitivity, and plant-level spatial accuracy. We compared two SLAM approaches against SfM as a benchmark.</div><div>Our findings reveal that deep learning-based SLAM outperforms SfM and feature-based SLAM in terms of position accuracy and point cloud resolution. Deep learning-based SLAM reduced average position error by 87% and increased point cloud resolution by 571%. However, feature-based SLAM demonstrated superior efficiency, making it a more suitable choice for real-time applications. These results offer valuable insights for selecting appropriate methods, considering illumination conditions, and optimizing parameters to balance accuracy and computational efficiency in orchard management activities.</div></div>","PeriodicalId":50627,"journal":{"name":"Computers and Electronics in Agriculture","volume":"227 ","pages":"Article 109661"},"PeriodicalIF":7.7000,"publicationDate":"2024-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Benchmarking of monocular camera UAV-based localization and mapping methods in vineyards\",\"authors\":\"Kaiwen Wang , Lammert Kooistra , Yaowu Wang , Sergio Vélez , Wensheng Wang , João Valente\",\"doi\":\"10.1016/j.compag.2024.109661\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>UAVs equipped with various sensors offer a promising approach for enhancing orchard management efficiency. Up-close sensing enables precise crop localization and mapping, providing valuable a priori information for informed decision-making. Current research on localization and mapping methods can be broadly classified into SfM, traditional feature-based SLAM, and deep learning-integrated SLAM. While previous studies have evaluated these methods on public datasets, real-world agricultural environments, particularly vineyards, present unique challenges due to their complexity, dynamism, and unstructured nature.</div><div>To bridge this gap, we conducted a comprehensive study in vineyards, collecting data under diverse conditions (flight modes, illumination conditions, and shooting angles) using a UAV equipped with high-resolution camera. 
To assess the performance of different methods, we proposed five evaluation metrics: efficiency, point cloud completeness, localization accuracy, parameter sensitivity, and plant-level spatial accuracy. We compared two SLAM approaches against SfM as a benchmark.</div><div>Our findings reveal that deep learning-based SLAM outperforms SfM and feature-based SLAM in terms of position accuracy and point cloud resolution. Deep learning-based SLAM reduced average position error by 87% and increased point cloud resolution by 571%. However, feature-based SLAM demonstrated superior efficiency, making it a more suitable choice for real-time applications. These results offer valuable insights for selecting appropriate methods, considering illumination conditions, and optimizing parameters to balance accuracy and computational efficiency in orchard management activities.</div></div>\",\"PeriodicalId\":50627,\"journal\":{\"name\":\"Computers and Electronics in Agriculture\",\"volume\":\"227 \",\"pages\":\"Article 109661\"},\"PeriodicalIF\":7.7000,\"publicationDate\":\"2024-11-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers and Electronics in Agriculture\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0168169924010524\",\"RegionNum\":1,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURE, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers and Electronics in Agriculture","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0168169924010524","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
Abstract
UAVs equipped with various sensors offer a promising approach for enhancing orchard management efficiency. Up-close sensing enables precise crop localization and mapping, providing valuable a priori information for informed decision-making. Current research on localization and mapping methods can be broadly classified into structure-from-motion (SfM), traditional feature-based simultaneous localization and mapping (SLAM), and deep learning-integrated SLAM. While previous studies have evaluated these methods on public datasets, real-world agricultural environments, particularly vineyards, present unique challenges due to their complexity, dynamism, and unstructured nature.
To bridge this gap, we conducted a comprehensive study in vineyards, collecting data under diverse conditions (flight modes, illumination conditions, and shooting angles) using a UAV equipped with a high-resolution camera. To assess the performance of the different methods, we proposed five evaluation metrics: efficiency, point cloud completeness, localization accuracy, parameter sensitivity, and plant-level spatial accuracy. We compared two SLAM approaches against SfM as a benchmark.
Our findings reveal that deep learning-based SLAM outperforms SfM and feature-based SLAM in terms of position accuracy and point cloud resolution. Deep learning-based SLAM reduced average position error by 87% and increased point cloud resolution by 571%. However, feature-based SLAM demonstrated superior efficiency, making it a more suitable choice for real-time applications. These results offer valuable insights for selecting appropriate methods, considering illumination conditions, and optimizing parameters to balance accuracy and computational efficiency in orchard management activities.
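The abstract does not detail how localization accuracy was scored. A common way to benchmark it in studies of this kind is the absolute trajectory error: the estimated camera positions are rigidly aligned to a ground-truth track (for example, RTK-GNSS positions) and the per-pose position errors are summarized as a mean and RMSE. The NumPy sketch below is a minimal illustration of that metric under those assumptions; the function names and the Umeyama-style alignment are illustrative, not the authors' implementation.

```python
import numpy as np

def align_umeyama(est, gt):
    """Least-squares rigid alignment (rotation, translation, scale) of the
    estimated positions onto ground truth; both arrays have shape (N, 3)."""
    mu_est, mu_gt = est.mean(axis=0), gt.mean(axis=0)
    est_c, gt_c = est - mu_est, gt - mu_gt
    cov = gt_c.T @ est_c / est.shape[0]           # 3x3 cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:  # guard against a reflection
        S[2, 2] = -1.0
    R = U @ S @ Vt
    scale = np.trace(np.diag(D) @ S) / est_c.var(axis=0).sum()
    t = mu_gt - scale * R @ mu_est
    return scale, R, t

def absolute_trajectory_error(est, gt):
    """Mean and RMSE position error (metres) after aligning est to gt."""
    scale, R, t = align_umeyama(est, gt)
    est_aligned = (scale * (R @ est.T)).T + t
    errors = np.linalg.norm(est_aligned - gt, axis=1)
    return errors.mean(), np.sqrt(np.mean(errors ** 2))

# Toy usage: a noisy copy of a short ground-truth track (hypothetical values).
gt = np.array([[0.0, 0.0, 10.0], [1.0, 0.5, 10.0],
               [2.0, 0.2, 10.2], [3.0, 1.0, 10.1]])
est = gt + np.random.default_rng(0).normal(scale=0.05, size=gt.shape)
mean_err, rmse = absolute_trajectory_error(est, gt)
print(f"mean position error: {mean_err:.3f} m, RMSE: {rmse:.3f} m")
```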
Journal overview:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.