{"title":"A Novel Computer Vision System for Efficient Flea Beetle Monitoring in Canola Crop","authors":"Muhib Ullah;Muhammad Shabbir Hasan;Abdul Bais;Tyler Wist;Shaun Sharpe","doi":"10.1109/TAFE.2024.3406329","DOIUrl":null,"url":null,"abstract":"Effective crop health monitoring is essential for farmers to make informed decisions about managing their crops. In canola crop management, the rapid proliferation of flea beetle (FB) populations is a major concern, as these pests can cause significant crop damage. Traditional manual field monitoring for FBs is time consuming and error-prone due to its reliance on visual assessments of FB damage to small seedlings, making conducting frequent and accurate surveys difficult. One of the key pieces of information in assessing if control of FBs is required is the presence of live FBs in the canola crop. This article proposes a novel insect-monitoring framework that uses a solar-powered, intelligent trap called the smart insect trap (SIT), equipped with a high-resolution camera and a deep-learning-based object detection network. Using this SIT, coupled with a kairomonal lure, the FB population can be monitored hourly, and population increases can be identified quickly. The SIT processes images at the edge and sends results to the cloud every 40 min for FB monitoring and analysis. It uses a modified you look only once version 8 small (YOLOv8s) object detection network, FB-YOLO, to improve its ability to detect small FBs. The modification is implemented in the network's neck, which aggregates features from the deep and early pyramids of the backbone in the neck. Improved attention to small objects is achieved by incorporating spatially aware features from early pyramids. In addition, the network is integrated with an advanced box selection algorithm called confluence nonmax suppression (NMS-C) to prevent duplicate detections in highly overlapped clusters of FBs. 
The FB-YOLO achieved an average precision (\n<inline-formula><tex-math>$\\text{mAP}@0.5$</tex-math></inline-formula>\n) of 89.97%, a 1.215% improvement over the YOLOv8s network with only 0.324 million additional parameters. Integrating NMS-C further improved the \n<inline-formula><tex-math>$\\text{mAP}@0.5$</tex-math></inline-formula>\n by 0.19%, leading to an overall \n<inline-formula><tex-math>$\\text{mAP}@0.5$</tex-math></inline-formula>\n of 90.16%.","PeriodicalId":100637,"journal":{"name":"IEEE Transactions on AgriFood Electronics","volume":"2 2","pages":"483-496"},"PeriodicalIF":0.0000,"publicationDate":"2024-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on AgriFood Electronics","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10589284/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Effective crop health monitoring is essential for farmers to make informed decisions about managing their crops. In canola crop management, the rapid proliferation of flea beetle (FB) populations is a major concern, as these pests can cause significant crop damage. Traditional manual field monitoring for FBs is time consuming and error-prone because it relies on visual assessments of FB damage to small seedlings, making frequent and accurate surveys difficult. One of the key pieces of information in assessing whether control of FBs is required is the presence of live FBs in the canola crop. This article proposes a novel insect-monitoring framework built around a solar-powered, intelligent trap called the smart insect trap (SIT), equipped with a high-resolution camera and a deep-learning-based object detection network. Using this SIT, coupled with a kairomonal lure, the FB population can be monitored hourly and population increases can be identified quickly. The SIT processes images at the edge and sends results to the cloud every 40 min for FB monitoring and analysis. It uses a modified "you only look once" version 8 small (YOLOv8s) object detection network, FB-YOLO, to improve its ability to detect small FBs. The modification is implemented in the network's neck, which aggregates features from both the deep and the early pyramid levels of the backbone. Improved attention to small objects is achieved by incorporating spatially aware features from the early pyramid levels. In addition, the network is integrated with an advanced box selection algorithm, confluence nonmaximum suppression (NMS-C), to prevent duplicate detections in highly overlapped clusters of FBs. FB-YOLO achieved an average precision ($\text{mAP}@0.5$) of 89.97%, a 1.215% improvement over the YOLOv8s network with only 0.324 million additional parameters. Integrating NMS-C further improved the $\text{mAP}@0.5$ by 0.19%, leading to an overall $\text{mAP}@0.5$ of 90.16%.
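The abstract does not specify exactly how FB-YOLO's neck fuses the deep and early pyramid levels. As a rough illustration of the general FPN-style idea it describes (upsampling a coarse, semantically rich feature map and combining it with an early, spatially detailed one), a minimal NumPy sketch with assumed channel counts and resolutions might look like this; the function names and shapes here are illustrative, not the paper's implementation:

```python
import numpy as np

def upsample2x(x):
    # Nearest-neighbour 2x upsampling of a (C, H, W) feature map.
    return x.repeat(2, axis=1).repeat(2, axis=2)

def aggregate(deep, early):
    # Fuse a coarse, semantically rich level with an early,
    # high-resolution level by upsampling and channel concatenation,
    # so small-object cues from the early level are preserved.
    return np.concatenate([upsample2x(deep), early], axis=0)

deep = np.zeros((64, 20, 20))   # deep pyramid level (assumed shape)
early = np.zeros((32, 40, 40))  # early, high-resolution level (assumed shape)
fused = aggregate(deep, early)
print(fused.shape)  # -> (96, 40, 40)
```

The fused map keeps the early level's spatial resolution, which is what gives the detector finer attention to small objects such as FBs.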
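For context on the box-selection step: NMS-C (confluence NMS) replaces the standard greedy, IoU-based nonmaximum suppression that detectors like YOLOv8 typically use. The paper's NMS-C algorithm is not reproduced here; the sketch below shows only the conventional greedy IoU baseline it improves on, which tends to merge or drop boxes in dense, highly overlapped clusters:

```python
def iou(a, b):
    # Intersection over union of two boxes given as (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def greedy_nms(boxes, scores, iou_thresh=0.5):
    # Standard greedy NMS: keep the highest-scoring box, then
    # discard any remaining box that overlaps it above the threshold.
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_thresh]
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
print(greedy_nms(boxes, scores))  # -> [0, 2]
```

In a tight cluster of FBs, the suppressed box at index 1 could well be a second, genuine beetle; this failure mode is what motivates confluence-style selection criteria that rely on box proximity rather than a single IoU cutoff.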