Precise Crop Pest Detection Based on Co-Ordinate-Attention-Based Feature Pyramid Module
Chenrui Kang, Lin Jiao, Kang Liu, Zhigui Liu, Rujing Wang
Insects, 16(1), published 2025-01-20. DOI: 10.3390/insects16010103
Citations: 0
Abstract
Insect pests strongly affect crop growth and value globally. Fast and precise pest detection and counting are crucial measures in the management and mitigation of pest infestations. In this area, deep learning technologies have come to represent the method with the most potential. However, for small-sized crop pests, recent deep-learning-based detection attempts have not achieved accurate recognition and detection, owing to the challenges posed by feature extraction and by positive and negative sample selection. To overcome these limitations, we first designed a co-ordinate-attention-based feature pyramid network, termed CAFPN, to extract the salient visual features that distinguish small insects from each other. Subsequently, in the network training stage, a dynamic sample selection strategy using positive and negative weight functions, which considers both high classification scores and precise localization, was introduced. Finally, several experiments were conducted on our constructed large-scale crop pest dataset, AgriPest 21, and on the IP102 dataset, achieving mAP (mean average precision) scores of 77.2% and 29.8%, respectively, and demonstrating promising detection results compared to other detectors.
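The abstract names a co-ordinate-attention mechanism as the core of the CAFPN feature pyramid but does not spell out its structure. Below is a minimal, hedged sketch of a coordinate-attention block in PyTorch, following the widely used formulation of Hou et al. (2021): pool along the height and width axes separately, encode the two descriptors jointly, then re-weight the feature map with direction-aware attention. The class name, reduction ratio, and the idea of applying it to a pyramid level before fusion are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of a coordinate-attention block (assumed PyTorch-style
# implementation); the paper's CAFPN presumably inserts a block like this
# into its feature pyramid, but the exact placement is an assumption.
import torch
import torch.nn as nn


class CoordinateAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        # Direction-aware pooling: one branch keeps height, the other keeps width.
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # (B, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.size()
        # Encode spatial context along each axis separately, then jointly.
        x_h = self.pool_h(x)                          # (B, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)      # (B, C, W, 1)
        y = torch.cat([x_h, x_w], dim=2)              # (B, C, H+W, 1)
        y = self.act(self.bn1(self.conv1(y)))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        y_w = y_w.permute(0, 1, 3, 2)                 # (B, mid, 1, W)
        # Position-sensitive attention maps along height and width.
        a_h = torch.sigmoid(self.conv_h(y_h))         # (B, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(y_w))         # (B, C, 1, W)
        return x * a_h * a_w


# Example: re-weight one pyramid level before fusion (shapes are illustrative).
if __name__ == "__main__":
    feat = torch.randn(1, 256, 64, 64)
    print(CoordinateAttention(256)(feat).shape)  # torch.Size([1, 256, 64, 64])
```

Because the attention maps are factorized along height and width rather than flattened into a single spatial map, positional information is preserved, which is the property the abstract leans on for localizing small insects.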
Insects (Agricultural and Biological Sciences: Insect Science)
CiteScore
5.10
Self-citation rate
10.00%
Articles published
1013
Review time
21.77 days
Journal Introduction:
Insects (ISSN 2075-4450) is an international, peer-reviewed, open access journal of entomology published online quarterly by MDPI. It publishes reviews, research papers, and communications related to the biology, physiology, and behavior of insects and arthropods. Our aim is to encourage scientists to publish their experimental and theoretical results in as much detail as possible. There is no restriction on the length of papers. Full experimental details must be provided so that the results can be reproduced. Electronic files containing the full details of the experimental procedure, if they cannot be published in the normal way, can be deposited as supplementary material.