{"title":"FG-UNet:用于分割无人机图像中杂草和作物的细粒度特征引导 UNet","authors":"Jianwu Lin, Xin Zhang, Yongbin Qin, Shengxian Yang, Xingtian Wen, Tomislav Cernava, Xiaoyulong Chen","doi":"10.1002/ps.8489","DOIUrl":null,"url":null,"abstract":"BACKGROUNDSemantic segmentation of weed and crop images is a key component and prerequisite for automated weed management. For weeds in unmanned aerial vehicle (UAV) images, which are usually characterized by small size and easily confused with crops at early growth stages, existing semantic segmentation models have difficulties to extract sufficiently fine features. This leads to their limited performance in weed and crop segmentation of UAV images.RESULTSWe proposed a fine‐grained feature‐guided UNet, named FG‐UNet, for weed and crop segmentation in UAV images. Specifically, there are two branches in FG‐UNet, namely the fine‐grained feature branch and the UNet branch. In the fine‐grained feature branch, a fine feature‐aware (FFA) module was designed to mine fine features in order to enhance the model's ability to segment small objects. In the UNet branch, we used an encoder–decoder structure to realize high‐level semantic feature extraction in images. In addition, a contextual feature fusion (CFF) module was designed for the fusion of the fine features and high‐level semantic features, thus enhancing the feature discrimination capability of the model. The experimental results showed that our proposed FG‐UNet, achieved state‐of‐the‐art performance compared to other semantic segmentation models, with mean intersection over union (MIOU) and mean pixel accuracy (MPA) of 88.06% and 92.37%, respectively.CONCLUSIONThe proposed method in this study lays a solid foundation for accurate detection and intelligent management of weeds. It will have a positive impact on the development of smart agriculture. © 2024 Society of Chemical Industry.","PeriodicalId":218,"journal":{"name":"Pest Management Science","volume":"67 1","pages":""},"PeriodicalIF":3.8000,"publicationDate":"2024-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"FG‐UNet: fine‐grained feature‐guided UNet for segmentation of weeds and crops in UAV images\",\"authors\":\"Jianwu Lin, Xin Zhang, Yongbin Qin, Shengxian Yang, Xingtian Wen, Tomislav Cernava, Xiaoyulong Chen\",\"doi\":\"10.1002/ps.8489\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"BACKGROUNDSemantic segmentation of weed and crop images is a key component and prerequisite for automated weed management. For weeds in unmanned aerial vehicle (UAV) images, which are usually characterized by small size and easily confused with crops at early growth stages, existing semantic segmentation models have difficulties to extract sufficiently fine features. This leads to their limited performance in weed and crop segmentation of UAV images.RESULTSWe proposed a fine‐grained feature‐guided UNet, named FG‐UNet, for weed and crop segmentation in UAV images. Specifically, there are two branches in FG‐UNet, namely the fine‐grained feature branch and the UNet branch. In the fine‐grained feature branch, a fine feature‐aware (FFA) module was designed to mine fine features in order to enhance the model's ability to segment small objects. In the UNet branch, we used an encoder–decoder structure to realize high‐level semantic feature extraction in images. 
In addition, a contextual feature fusion (CFF) module was designed for the fusion of the fine features and high‐level semantic features, thus enhancing the feature discrimination capability of the model. The experimental results showed that our proposed FG‐UNet, achieved state‐of‐the‐art performance compared to other semantic segmentation models, with mean intersection over union (MIOU) and mean pixel accuracy (MPA) of 88.06% and 92.37%, respectively.CONCLUSIONThe proposed method in this study lays a solid foundation for accurate detection and intelligent management of weeds. It will have a positive impact on the development of smart agriculture. © 2024 Society of Chemical Industry.\",\"PeriodicalId\":218,\"journal\":{\"name\":\"Pest Management Science\",\"volume\":\"67 1\",\"pages\":\"\"},\"PeriodicalIF\":3.8000,\"publicationDate\":\"2024-10-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Pest Management Science\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://doi.org/10.1002/ps.8489\",\"RegionNum\":1,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRONOMY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pest Management Science","FirstCategoryId":"97","ListUrlMain":"https://doi.org/10.1002/ps.8489","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRONOMY","Score":null,"Total":0}
BACKGROUND: Semantic segmentation of weed and crop images is a key component of, and prerequisite for, automated weed management. Weeds in unmanned aerial vehicle (UAV) images are usually small and easily confused with crops at early growth stages, so existing semantic segmentation models have difficulty extracting sufficiently fine features. This limits their performance in weed and crop segmentation of UAV images.

RESULTS: We proposed a fine‐grained feature‐guided UNet, named FG‐UNet, for weed and crop segmentation in UAV images. FG‐UNet has two branches: a fine‐grained feature branch and a UNet branch. In the fine‐grained feature branch, a fine feature‐aware (FFA) module was designed to mine fine features and thereby enhance the model's ability to segment small objects. In the UNet branch, an encoder–decoder structure was used to extract high‐level semantic features from the images. In addition, a contextual feature fusion (CFF) module was designed to fuse the fine features with the high‐level semantic features, enhancing the feature discrimination capability of the model. The experimental results showed that FG‐UNet achieved state‐of‐the‐art performance compared with other semantic segmentation models, with a mean intersection over union (MIOU) of 88.06% and a mean pixel accuracy (MPA) of 92.37%.

CONCLUSION: The proposed method lays a solid foundation for accurate detection and intelligent management of weeds and will have a positive impact on the development of smart agriculture. © 2024 Society of Chemical Industry.
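The abstract only outlines the architecture, so the following is a minimal PyTorch sketch of the two-branch idea it describes: a full-resolution branch standing in for the fine feature-aware (FFA) module, an encoder–decoder branch standing in for the UNet branch, and a concatenate-and-convolve step standing in for the contextual feature fusion (CFF) module. All module internals, channel widths, and class names here are illustrative assumptions and are not taken from the paper.

```python
# Illustrative two-branch, FG-UNet-style segmentation sketch.
# FFA, UNet, and CFF internals below are placeholders, not the published architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FineFeatureBranch(nn.Module):
    """Stand-in for the FFA branch: shallow convolutions at full
    resolution to preserve small-object detail."""
    def __init__(self, in_ch=3, ch=32):
        super().__init__()
        self.convs = nn.Sequential(
            nn.Conv2d(in_ch, ch, 3, padding=1), nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.convs(x)

class UNetBranch(nn.Module):
    """Stand-in for the encoder-decoder branch that extracts high-level semantics."""
    def __init__(self, in_ch=3, ch=32):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, ch, 3, padding=1), nn.ReLU(inplace=True))
        self.enc2 = nn.Sequential(nn.Conv2d(ch, ch * 2, 3, padding=1), nn.ReLU(inplace=True))
        self.dec = nn.Sequential(nn.Conv2d(ch * 2, ch, 3, padding=1), nn.ReLU(inplace=True))
        self.pool = nn.MaxPool2d(2)

    def forward(self, x):
        e1 = self.enc1(x)              # full resolution
        e2 = self.enc2(self.pool(e1))  # 1/2 resolution, higher-level features
        d = self.dec(e2)
        return F.interpolate(d, size=e1.shape[2:], mode="bilinear", align_corners=False)

class ContextFusion(nn.Module):
    """Stand-in for the CFF module: concatenate fine and semantic features,
    then mix them with a 1x1 convolution."""
    def __init__(self, fine_ch=32, sem_ch=32, out_ch=32):
        super().__init__()
        self.fuse = nn.Sequential(nn.Conv2d(fine_ch + sem_ch, out_ch, 1), nn.ReLU(inplace=True))

    def forward(self, fine, sem):
        return self.fuse(torch.cat([fine, sem], dim=1))

class TwoBranchSegNet(nn.Module):
    def __init__(self, num_classes=3):  # e.g. background / crop / weed
        super().__init__()
        self.fine = FineFeatureBranch()
        self.unet = UNetBranch()
        self.cff = ContextFusion()
        self.head = nn.Conv2d(32, num_classes, 1)

    def forward(self, x):
        return self.head(self.cff(self.fine(x), self.unet(x)))

if __name__ == "__main__":
    logits = TwoBranchSegNet()(torch.randn(1, 3, 256, 256))
    print(logits.shape)  # torch.Size([1, 3, 256, 256])
```

The reported MIOU and MPA are standard segmentation metrics. Under their usual definitions they can be computed from a class confusion matrix as sketched below; this is the conventional formulation, not code from the paper, and exact definitions of MPA vary slightly between papers.

```python
import numpy as np

def miou_and_mpa(conf):
    """conf[i, j] = number of pixels with ground-truth class i predicted as class j."""
    conf = conf.astype(np.float64)
    tp = np.diag(conf)
    iou = tp / (conf.sum(axis=1) + conf.sum(axis=0) - tp)  # per-class IoU
    pa = tp / conf.sum(axis=1)                             # per-class pixel accuracy
    return iou.mean(), pa.mean()                           # MIOU, MPA
```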