Benchmarking of monocular camera UAV-based localization and mapping methods in vineyards

IF 7.7 | CAS Zone 1, Agricultural and Forestry Sciences | JCR Q1, AGRICULTURE, MULTIDISCIPLINARY | Computers and Electronics in Agriculture | Pub Date: 2024-11-15 | DOI: 10.1016/j.compag.2024.109661
Kaiwen Wang, Lammert Kooistra, Yaowu Wang, Sergio Vélez, Wensheng Wang, João Valente
{"title":"葡萄园中基于单目摄像头无人机的定位和绘图方法基准测试","authors":"Kaiwen Wang ,&nbsp;Lammert Kooistra ,&nbsp;Yaowu Wang ,&nbsp;Sergio Vélez ,&nbsp;Wensheng Wang ,&nbsp;João Valente","doi":"10.1016/j.compag.2024.109661","DOIUrl":null,"url":null,"abstract":"<div><div>UAVs equipped with various sensors offer a promising approach for enhancing orchard management efficiency. Up-close sensing enables precise crop localization and mapping, providing valuable a priori information for informed decision-making. Current research on localization and mapping methods can be broadly classified into SfM, traditional feature-based SLAM, and deep learning-integrated SLAM. While previous studies have evaluated these methods on public datasets, real-world agricultural environments, particularly vineyards, present unique challenges due to their complexity, dynamism, and unstructured nature.</div><div>To bridge this gap, we conducted a comprehensive study in vineyards, collecting data under diverse conditions (flight modes, illumination conditions, and shooting angles) using a UAV equipped with high-resolution camera. To assess the performance of different methods, we proposed five evaluation metrics: efficiency, point cloud completeness, localization accuracy, parameter sensitivity, and plant-level spatial accuracy. We compared two SLAM approaches against SfM as a benchmark.</div><div>Our findings reveal that deep learning-based SLAM outperforms SfM and feature-based SLAM in terms of position accuracy and point cloud resolution. Deep learning-based SLAM reduced average position error by 87% and increased point cloud resolution by 571%. However, feature-based SLAM demonstrated superior efficiency, making it a more suitable choice for real-time applications. These results offer valuable insights for selecting appropriate methods, considering illumination conditions, and optimizing parameters to balance accuracy and computational efficiency in orchard management activities.</div></div>","PeriodicalId":50627,"journal":{"name":"Computers and Electronics in Agriculture","volume":"227 ","pages":"Article 109661"},"PeriodicalIF":7.7000,"publicationDate":"2024-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Benchmarking of monocular camera UAV-based localization and mapping methods in vineyards\",\"authors\":\"Kaiwen Wang ,&nbsp;Lammert Kooistra ,&nbsp;Yaowu Wang ,&nbsp;Sergio Vélez ,&nbsp;Wensheng Wang ,&nbsp;João Valente\",\"doi\":\"10.1016/j.compag.2024.109661\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>UAVs equipped with various sensors offer a promising approach for enhancing orchard management efficiency. Up-close sensing enables precise crop localization and mapping, providing valuable a priori information for informed decision-making. Current research on localization and mapping methods can be broadly classified into SfM, traditional feature-based SLAM, and deep learning-integrated SLAM. While previous studies have evaluated these methods on public datasets, real-world agricultural environments, particularly vineyards, present unique challenges due to their complexity, dynamism, and unstructured nature.</div><div>To bridge this gap, we conducted a comprehensive study in vineyards, collecting data under diverse conditions (flight modes, illumination conditions, and shooting angles) using a UAV equipped with high-resolution camera. 
To assess the performance of different methods, we proposed five evaluation metrics: efficiency, point cloud completeness, localization accuracy, parameter sensitivity, and plant-level spatial accuracy. We compared two SLAM approaches against SfM as a benchmark.</div><div>Our findings reveal that deep learning-based SLAM outperforms SfM and feature-based SLAM in terms of position accuracy and point cloud resolution. Deep learning-based SLAM reduced average position error by 87% and increased point cloud resolution by 571%. However, feature-based SLAM demonstrated superior efficiency, making it a more suitable choice for real-time applications. These results offer valuable insights for selecting appropriate methods, considering illumination conditions, and optimizing parameters to balance accuracy and computational efficiency in orchard management activities.</div></div>\",\"PeriodicalId\":50627,\"journal\":{\"name\":\"Computers and Electronics in Agriculture\",\"volume\":\"227 \",\"pages\":\"Article 109661\"},\"PeriodicalIF\":7.7000,\"publicationDate\":\"2024-11-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers and Electronics in Agriculture\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0168169924010524\",\"RegionNum\":1,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURE, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers and Electronics in Agriculture","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0168169924010524","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract

UAVs equipped with various sensors offer a promising approach for enhancing orchard management efficiency. Up-close sensing enables precise crop localization and mapping, providing valuable a priori information for informed decision-making. Current research on localization and mapping methods can be broadly classified into SfM, traditional feature-based SLAM, and deep learning-integrated SLAM. While previous studies have evaluated these methods on public datasets, real-world agricultural environments, particularly vineyards, present unique challenges due to their complexity, dynamism, and unstructured nature.
To bridge this gap, we conducted a comprehensive study in vineyards, collecting data under diverse conditions (flight modes, illumination conditions, and shooting angles) using a UAV equipped with a high-resolution camera. To assess the performance of different methods, we proposed five evaluation metrics: efficiency, point cloud completeness, localization accuracy, parameter sensitivity, and plant-level spatial accuracy. We compared two SLAM approaches against SfM as a benchmark.
Our findings reveal that deep learning-based SLAM outperforms SfM and feature-based SLAM in terms of position accuracy and point cloud resolution. Deep learning-based SLAM reduced average position error by 87% and increased point cloud resolution by 571%. However, feature-based SLAM demonstrated superior efficiency, making it a more suitable choice for real-time applications. These results offer valuable insights for selecting appropriate methods, considering illumination conditions, and optimizing parameters to balance accuracy and computational efficiency in orchard management activities.
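The authors' evaluation code is not included on this page, so the following is only a minimal sketch of how a localization-accuracy metric like the one above is conventionally scored in UAV SLAM/SfM benchmarks: absolute trajectory error (ATE) after rigid Kabsch/Umeyama alignment of the estimated track to ground truth. The function names, the NumPy-only implementation, and the synthetic check are illustrative assumptions, not the paper's actual pipeline.

```python
# Hedged sketch (not the paper's code): ATE RMSE, a standard way to
# quantify "localization accuracy" when benchmarking SLAM/SfM against
# a reference trajectory such as RTK-GNSS.
import numpy as np

def align_rigid(est, gt):
    """Kabsch/Umeyama least-squares rigid alignment (rotation + translation,
    no scale) of estimated positions est (N,3) onto ground truth gt (N,3)."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    H = (est - mu_e).T @ (gt - mu_g)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_g - R @ mu_e
    return est @ R.T + t

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error (same units as input)."""
    aligned = align_rigid(est, gt)
    return np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1)))

# Synthetic sanity check: a rotated and shifted copy of a track should
# score ~0 after alignment.
gt = np.cumsum(np.random.default_rng(0).normal(size=(100, 3)), axis=0)
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
est = gt @ Rz.T + np.array([5.0, -2.0, 1.0])
print(f"ATE RMSE: {ate_rmse(est, gt):.6f}")   # approximately 0
```

On such per-method errors, a relative improvement like the reported 87% reduction corresponds to (e_baseline - e_method) / e_baseline x 100, with SfM as the baseline.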
Source journal
Computers and Electronics in Agriculture (Engineering & Technology: Computer Science, Interdisciplinary Applications)
CiteScore: 15.30
Self-citation rate: 14.50%
Annual articles: 800
Review time: 62 days
Journal introduction: Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and applications notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.
Latest articles in this journal
- Construction and validation of a mathematical model for the pressure subsidence of mixed crop straw in Shajiang black soil
- Fish feeding behavior recognition using time-domain and frequency-domain signals fusion from six-axis inertial sensors
- Estimation of crop leaf area index based on Sentinel-2 images and PROSAIL-Transformer coupling model
- Design, integration, and field evaluation of a selective harvesting robot for broccoli
- A Novel Behavior Detection Method for Sows and Piglets during Lactation Based on an Inspection Robot