Dujin Wang, Yizhong Wang, Ming Li, Xinting Yang, Jianwei Wu, Wenyong Li
{"title":"使用改进的YOLOv4深度学习网络在粘捕器图像上准确检测白蝇和蓟马","authors":"Dujin Wang, Yizhong Wang, Ming Li, Xinting Yang, Jianwei Wu, Wenyong Li","doi":"10.13031/TRANS.14394","DOIUrl":null,"url":null,"abstract":"Highlights The proposed method detected thrips and whitefly more accurately than previous methods. The proposed method demonstrated good robustness to illumination reflections and different pest densities. Small pest detection is improved by adding large-scale feature maps and more residual units to a shallow network. Machine vision and deep learning create an end-to-end model to detect smallsmall pests on sticky traps in field conditions. Abstract. Pest detection is the basis of precise control in vegetable greenhouses. To improve the detection accuracy and robustness of two common small pests in greenhouses, whitefly and thrips, this study proposes a novel small object detection approach based on the YOLOv4 model. Yellow sticky trap (YST) images at the original resolution (2560x1920 pixels) were collected using a pest monitoring equipment in a greenhouse. They were then cropped and labeled to create the sub-images (416x416 pixels) to construct an experimental dataset. The labeled images of this study (900 training, 100 validation, and 200 test) are available for comparative studies. To enhance the model‘s ability to detect small pests, the feature map at the 8-fold downsampling layer in the backbone network was merged with the feature map at the 4-fold downsampling layer to generate a new layer and output a feature map with a size of 104x104 pixels. Furthermore, the residual units in the first two residual blocks are enlarged by four times to extract more shallow image features and the location information of target pests to withstand image degradation in the field. The experimental results show that the detection mAP of whitefly and thrips using the proposed approach is improved by 8.2% and 3.4% compared with the YOLOv3 and YOLOv4 models, respectively. 
The detection performance slightly decreases as the pest densities increase in the YST image, but the mAP value was still 92.7% in the high-density dataset, which indicates that the proposed model has good robustness over a range of pest densities. Compared with some previous similar studies, the proposed method has better potential to monitor whitefly and thrips using YSTs in field conditions.","PeriodicalId":23120,"journal":{"name":"Transactions of the ASABE","volume":"7 1","pages":"0"},"PeriodicalIF":1.4000,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Using an Improved YOLOv4 Deep Learning Network for Accurate Detection of Whitefly and Thrips on Sticky Trap Images\",\"authors\":\"Dujin Wang, Yizhong Wang, Ming Li, Xinting Yang, Jianwei Wu, Wenyong Li\",\"doi\":\"10.13031/TRANS.14394\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Highlights The proposed method detected thrips and whitefly more accurately than previous methods. The proposed method demonstrated good robustness to illumination reflections and different pest densities. Small pest detection is improved by adding large-scale feature maps and more residual units to a shallow network. Machine vision and deep learning create an end-to-end model to detect smallsmall pests on sticky traps in field conditions. Abstract. Pest detection is the basis of precise control in vegetable greenhouses. To improve the detection accuracy and robustness of two common small pests in greenhouses, whitefly and thrips, this study proposes a novel small object detection approach based on the YOLOv4 model. Yellow sticky trap (YST) images at the original resolution (2560x1920 pixels) were collected using a pest monitoring equipment in a greenhouse. They were then cropped and labeled to create the sub-images (416x416 pixels) to construct an experimental dataset. 
The labeled images of this study (900 training, 100 validation, and 200 test) are available for comparative studies. To enhance the model‘s ability to detect small pests, the feature map at the 8-fold downsampling layer in the backbone network was merged with the feature map at the 4-fold downsampling layer to generate a new layer and output a feature map with a size of 104x104 pixels. Furthermore, the residual units in the first two residual blocks are enlarged by four times to extract more shallow image features and the location information of target pests to withstand image degradation in the field. The experimental results show that the detection mAP of whitefly and thrips using the proposed approach is improved by 8.2% and 3.4% compared with the YOLOv3 and YOLOv4 models, respectively. The detection performance slightly decreases as the pest densities increase in the YST image, but the mAP value was still 92.7% in the high-density dataset, which indicates that the proposed model has good robustness over a range of pest densities. 
Compared with some previous similar studies, the proposed method has better potential to monitor whitefly and thrips using YSTs in field conditions.\",\"PeriodicalId\":23120,\"journal\":{\"name\":\"Transactions of the ASABE\",\"volume\":\"7 1\",\"pages\":\"0\"},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2021-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Transactions of the ASABE\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://doi.org/10.13031/TRANS.14394\",\"RegionNum\":4,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"AGRICULTURAL ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Transactions of the ASABE","FirstCategoryId":"97","ListUrlMain":"https://doi.org/10.13031/TRANS.14394","RegionNum":4,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"AGRICULTURAL ENGINEERING","Score":null,"Total":0}
Using an Improved YOLOv4 Deep Learning Network for Accurate Detection of Whitefly and Thrips on Sticky Trap Images
Highlights
- The proposed method detects thrips and whitefly more accurately than previous methods.
- The proposed method demonstrates good robustness to illumination reflections and different pest densities.
- Small-pest detection is improved by adding large-scale feature maps and more residual units to the shallow part of the network.
- Machine vision and deep learning create an end-to-end model for detecting small pests on sticky traps in field conditions.

Abstract
Pest detection is the basis of precise pest control in vegetable greenhouses. To improve the detection accuracy and robustness for two common small greenhouse pests, whitefly and thrips, this study proposes a novel small-object detection approach based on the YOLOv4 model. Yellow sticky trap (YST) images at the original resolution (2560x1920 pixels) were collected using pest monitoring equipment in a greenhouse. The images were then cropped and labeled to create 416x416 pixel sub-images for an experimental dataset. The labeled images of this study (900 training, 100 validation, and 200 test) are available for comparative studies. To enhance the model's ability to detect small pests, the feature map at the 8-fold downsampling layer of the backbone network was merged with the feature map at the 4-fold downsampling layer, generating a new layer that outputs a 104x104 pixel feature map. Furthermore, the number of residual units in the first two residual blocks was quadrupled to extract more shallow image features and location information for the target pests, making the model more resistant to image degradation in the field. The experimental results show that the detection mAP for whitefly and thrips using the proposed approach improves by 8.2% and 3.4% over the YOLOv3 and YOLOv4 models, respectively.
Detection performance decreases slightly as pest density in the YST image increases, but the mAP remained 92.7% on the high-density dataset, indicating that the proposed model is robust over a range of pest densities. Compared with similar previous studies, the proposed method shows better potential for monitoring whitefly and thrips on YSTs in field conditions.
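The abstract describes cropping full-resolution 2560x1920 trap images into 416x416 sub-images matching the network input size. The paper does not release its cropping code, so the following is a minimal sketch of one plausible non-overlapping tiling scheme; the zero-padding of edge tiles and the function name `tile_image` are assumptions, not the authors' implementation.

```python
import numpy as np

def tile_image(img: np.ndarray, tile: int = 416) -> list:
    """Split an H x W x C image array into non-overlapping tile x tile crops.

    Edge crops that fall short of the full tile size are zero-padded so
    every crop matches the network input resolution (an assumed choice;
    the paper does not specify how edges are handled).
    """
    h, w = img.shape[:2]
    crops = []
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            crop = img[y:y + tile, x:x + tile]
            if crop.shape[0] != tile or crop.shape[1] != tile:
                padded = np.zeros((tile, tile) + img.shape[2:], dtype=img.dtype)
                padded[:crop.shape[0], :crop.shape[1]] = crop
                crop = padded
            crops.append(crop)
    return crops

# A 2560x1920 trap image yields ceil(1920/416) x ceil(2560/416) = 5 x 7 = 35 crops.
crops = tile_image(np.zeros((1920, 2560, 3), dtype=np.uint8))
```

Under this scheme each original YST image contributes 35 sub-images, which is consistent with building a dataset of ~1200 labeled 416x416 images from a modest number of trap photos.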
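The 104x104 detection layer described above follows from the input size: for a 416x416 input, the 8-fold downsampling layer produces a 52x52 feature map and the 4-fold layer a 104x104 map, so merging them requires 2x upsampling of the deeper map before channel concatenation. The sketch below illustrates only the shape arithmetic with NumPy; the channel counts (256 and 128) are illustrative assumptions, and the actual model would use learned convolutions rather than raw concatenation.

```python
import numpy as np

def upsample2x(fm: np.ndarray) -> np.ndarray:
    """Nearest-neighbor 2x spatial upsampling of a C x H x W feature map."""
    return fm.repeat(2, axis=1).repeat(2, axis=2)

# For a 416x416 input: 416 / 8 = 52 and 416 / 4 = 104.
deep = np.random.rand(256, 52, 52)      # 8-fold downsampling layer (channels assumed)
shallow = np.random.rand(128, 104, 104)  # 4-fold downsampling layer (channels assumed)

# Upsample the deeper map to 104x104 and stack along the channel axis,
# giving the 104x104 feature map that feeds the extra small-object head.
merged = np.concatenate([upsample2x(deep), shallow], axis=0)
```

The resulting 104x104 grid quadruples the number of detection cells relative to YOLOv4's finest 52x52 scale, which is the mechanism the abstract credits for the improved small-pest detection.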
About the journal:
This peer-reviewed journal publishes research that advances the engineering of agricultural, food, and biological systems. Submissions must include original data, analysis or design, or synthesis of existing information; research information for the improvement of education, design, construction, or manufacturing practice; or significant and convincing evidence that confirms and strengthens the findings of others or that revises ideas or challenges accepted theory.