Aerial-Based Weed Detection Using Low-Cost and Lightweight Deep Learning Models on an Edge Platform

IF 1.2 | Q3, Agricultural Engineering (CAS Quartile 4, Agricultural & Forestry Sciences) | Journal of the ASABE | Pub Date: 2023-01-01 | DOI: 10.13031/ja.15413
Nitin Rai, Xin Sun, C. Igathinathane, Kirk Howatt, Michael Ostlie
{"title":"Aerial-Based Weed Detection Using Low-Cost and Lightweight Deep Learning Models on an Edge Platform","authors":"Nitin Rai, Xin Sun, C. Igathinathane, Kirk Howatt, Michael Ostlie","doi":"10.13031/ja.15413","DOIUrl":null,"url":null,"abstract":"Highlights Lightweight deep learning models were trained on an edge device to identify weeds in aerial images. A customized configuration file was setup to train the models. These models were deployed to detect weeds in aerial images and videos (near real-time). CSPMobileNet-v2 and YOLOv4-lite are recommended models for weed detection using edge platform. Abstract. Deep learning (DL) techniques have proven to be a successful approach in detecting weeds for site-specific weed management (SSWM). In the past, most of the research work has trained and deployed pre-trained DL models on high-end systems coupled with expensive graphical processing units (GPUs). However, only a limited number of research studies have used DL models on an edge system for aerial-based weed detection. Therefore, while focusing on hardware cost minimization, eight DL models were trained and deployed on an edge device to detect weeds in aerial-image context and videos in this study. Four large models, namely CSPDarkNet-53, DarkNet-53, DenseNet-201, and ResNet-50, along with four lightweight models, CSPMobileNet-v2, YOLOv4-lite, EfficientNet-B0, and DarkNet-Ref, were considered for training a customized DL architecture. Along with trained model performance scores (average precision score, mean average precision (mAP), intersection over union, precision, and recall), other model metrics to assess edge system performance such as billion floating-point operations/s (BFLOPS), frame rates/s (FPS), and GPU memory usage were also estimated. The lightweight CSPMobileNet-v2 and YOLOv4-lite models outperformed others in detecting weeds in aerial image context. These models were able to achieve a mAP score of 83.2% and 82.2%, delivering an FPS of 60.9 and 61.1 during near real-time weed detection in aerial videos, respectively. The popular ResNet-50 model achieved a mAP of 79.6%, which was the highest amongst all the large models deployed for weed detection tasks. Based on the results, the two lightweight models, namely, CSPMobileNet-v2 and YOLOv4-lite, are recommended, and they can be used on a low-cost edge system to detect weeds in aerial image context with significant accuracy. Keywords: Aerial image, Deep learning, Edge device, Precision agriculture, Weed detection.","PeriodicalId":29714,"journal":{"name":"Journal of the ASABE","volume":"106 1","pages":"0"},"PeriodicalIF":1.2000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of the ASABE","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.13031/ja.15413","RegionNum":4,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"AGRICULTURAL ENGINEERING","Score":null,"Total":0}
Citations: 1

Abstract

Highlights: Lightweight deep learning models were trained on an edge device to identify weeds in aerial images. A customized configuration file was set up to train the models. The models were deployed to detect weeds in aerial images and videos in near real time. CSPMobileNet-v2 and YOLOv4-lite are the recommended models for weed detection on an edge platform.

Abstract. Deep learning (DL) techniques have proven to be a successful approach to detecting weeds for site-specific weed management (SSWM). In the past, most research has trained and deployed pre-trained DL models on high-end systems coupled with expensive graphics processing units (GPUs). However, only a limited number of studies have run DL models on an edge system for aerial-based weed detection. Therefore, with a focus on hardware cost minimization, this study trained and deployed eight DL models on an edge device to detect weeds in aerial images and videos. Four large models, namely CSPDarkNet-53, DarkNet-53, DenseNet-201, and ResNet-50, along with four lightweight models, CSPMobileNet-v2, YOLOv4-lite, EfficientNet-B0, and DarkNet-Ref, were considered for training a customized DL architecture. Along with trained-model performance scores (average precision, mean average precision (mAP), intersection over union, precision, and recall), metrics that assess edge-system performance, such as billions of floating-point operations per second (BFLOPS), frames per second (FPS), and GPU memory usage, were also estimated. The lightweight CSPMobileNet-v2 and YOLOv4-lite models outperformed the others at detecting weeds in aerial images, achieving mAP scores of 83.2% and 82.2% and delivering 60.9 and 61.1 FPS, respectively, during near real-time weed detection in aerial videos. The popular ResNet-50 model achieved a mAP of 79.6%, the highest among the large models deployed for the weed detection task. Based on these results, the two lightweight models, CSPMobileNet-v2 and YOLOv4-lite, are recommended; they can be used on a low-cost edge system to detect weeds in aerial images with high accuracy.

Keywords: Aerial image, Deep learning, Edge device, Precision agriculture, Weed detection.
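The abstract reports both detection quality (intersection over union, precision, recall, mAP) and edge-runtime throughput (FPS). As a rough illustration of how such numbers are typically computed, the following is a minimal Python sketch, not the authors' evaluation code: it assumes axis-aligned boxes in (x1, y1, x2, y2) pixel coordinates, a fixed IoU threshold of 0.5, and a hypothetical `detect_fn` standing in for whatever per-frame inference call the deployed model exposes.

```python
import time

def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def precision_recall(pred_boxes, gt_boxes, iou_thresh=0.5):
    """Greedy one-to-one matching of predicted boxes to ground truth at a fixed IoU threshold."""
    matched, tp = set(), 0
    for p in pred_boxes:
        best_j, best_iou = -1, 0.0
        for j, g in enumerate(gt_boxes):
            if j in matched:
                continue
            v = iou(p, g)
            if v > best_iou:
                best_j, best_iou = j, v
        if best_iou >= iou_thresh:
            tp += 1
            matched.add(best_j)
    precision = tp / max(len(pred_boxes), 1)
    recall = tp / max(len(gt_boxes), 1)
    return precision, recall

def measure_fps(detect_fn, frames):
    """Average frames per second over a sequence of frames; detect_fn is any per-frame detector call."""
    start = time.time()
    for frame in frames:
        detect_fn(frame)  # placeholder for the model's inference call on the edge device
    return len(frames) / max(time.time() - start, 1e-9)
```

Note that a full mAP computation additionally sweeps the detector's confidence threshold, builds a precision-recall curve per class, and averages the resulting average precision values over classes; the sketch above only covers the single-threshold precision/recall and FPS pieces.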