UAV Sensing-Based Litchi Segmentation Using Modified Mask-RCNN for Precision Agriculture

Bhabesh Deka; Debarun Chakraborty
IEEE Transactions on AgriFood Electronics, vol. 2, no. 2, pp. 509-517, published 2024-07-12
DOI: 10.1109/TAFE.2024.3420028 (https://ieeexplore.ieee.org/document/10596987/)
Traditional manual counting of litchi fruits is labor-intensive, time-consuming, and error-prone. Moreover, the complex growth structure of litchi trees, with occlusion by leaves and branches, overlapping fruits, and uneven coloring, makes it challenging for current baseline detection and instance segmentation models to accurately identify litchi fruits. Advances in deep learning architectures and modern sensing technologies, such as unmanned aerial vehicles (UAVs), have shown great potential for improving fruit-counting accuracy and efficiency. In this article, we propose a modified Mask region-based convolutional neural network (Mask R-CNN) instance segmentation model with channel attention to detect and count litchis in complex natural environments from UAV imagery. In addition, we build a UAV-Litchi dataset consisting of 1000 images with 31 000 litchi annotations, collected by a DJI Phantom 4 with an RGB sensor and labeled with the LabelImg annotation tool. Experimental results show that the proposed model with the squeeze-and-excitation block improves the detection accuracy of litchi fruits, achieving a mean average precision, recall, and F1 score of 81.47%, 92.81%, and 88.40%, respectively, with an average inference time of 7.72 s. The high accuracy and efficiency of the proposed model demonstrate its potential for precise and accurate litchi detection in precision agriculture.
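The channel-attention mechanism the abstract refers to, the squeeze-and-excitation (SE) block, can be illustrated with a minimal sketch. This is not the authors' implementation; it is a generic pure-Python rendering of the standard SE computation (squeeze by global average pooling, excitation by two fully connected layers with ReLU and sigmoid, then channel-wise rescaling), with hypothetical weight arguments `w1` and `w2` standing in for the learned parameters:

```python
import math

def se_block(feature_map, w1, w2):
    """Squeeze-and-excitation channel attention (illustrative sketch).

    feature_map: list of C channels, each an H x W list of lists of floats.
    w1: reduction weights, shape (C // r) x C, for the first FC layer.
    w2: expansion weights, shape C x (C // r), for the second FC layer.
    Returns the feature map with each channel scaled by its attention score.
    """
    # Squeeze: global average pooling collapses each channel to one scalar.
    z = [sum(sum(row) for row in ch) / (len(ch) * len(ch[0])) for ch in feature_map]
    # Excitation: FC -> ReLU -> FC -> sigmoid yields one score per channel.
    hidden = [max(0.0, sum(w * zj for w, zj in zip(row, z))) for row in w1]
    s = [1.0 / (1.0 + math.exp(-sum(w * h for w, h in zip(row, hidden)))) for row in w2]
    # Scale: recalibrate each channel by its score (channel-wise multiply).
    return [[[v * s[c] for v in row] for row in ch] for c, ch in enumerate(feature_map)]
```

In the paper's setting such a block would be inserted into the Mask R-CNN backbone so that informative channels (e.g., those responding to litchi-colored regions) are amplified before region proposals are computed.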