Shufflenetv2UNet: An improved neural network model for grassland sample coverage extraction

Grass and Forage Science | IF 2.7 | JCR Q1 (AGRONOMY) | CAS Tier 3 (Agricultural and Forest Sciences) | Pub Date: 2024-11-11 | DOI: 10.1111/gfs.12697
Yunyu Liu, Tonghai Liu, Fanzhen Wang, Hongxiao Shi, Hai Wang, Bagen Hasi, Fangyu Gao, Changqin Liu, Hua Li
{"title":"Shufflenetv2UNet: An improved neural network model for grassland sample coverage extraction","authors":"Yunyu Liu,&nbsp;Tonghai Liu,&nbsp;Fanzhen Wang,&nbsp;Hongxiao Shi,&nbsp;Hai Wang,&nbsp;Bagen Hasi,&nbsp;Fangyu Gao,&nbsp;Changqin Liu,&nbsp;Hua Li","doi":"10.1111/gfs.12697","DOIUrl":null,"url":null,"abstract":"<p>Accurate extraction of grassland sample coverage is crucial for regional ecological environment monitoring. Due to the strong feature learning capability, high flexibility, and scalability of deep learning methods, they have great potential in grassland sample extraction modelling. However, we still lack a model that can achieve both lightweight structure and effective performance for small object segmentation to considering the small target characteristics of grassland vegetation and the requirements for model deployment in later stages. Here, we combined the UNet model, which performs well in small target segmentation, with the lightweight network Shufflenetv2 model, proposing an improved UNet neural network, Shufflenetv2UNet, for grassland sample coverage extraction. The core of Shufflenetv2UNet is the removal of maximum pooling and double-layer convolution modules from downsampling in the UNet neural network. In addition, the Inverted Residual Block structure module from Shufflenetv2 was added to achieve a lightweight model and improved extraction accuracy. The Shufflenetv2UNet achieves an accuracy of 98.23%, with a parameter size of 50.74 M, and a model inference speed of 0.004 s. Compared to existing extraction methods, this model has advantages in prediction accuracy, parameter size, and model inference speed. Moreover, Shufflenetv2UNet achieved different types of grassland sample coverage extractions, with good robustness, generalization, and universality, enabling investigators to quickly and accurately obtain grassland sample coverage. This allows more dynamic and accurate ground measurement data for regional grassland environmental monitoring.</p>","PeriodicalId":12767,"journal":{"name":"Grass and Forage Science","volume":"79 4","pages":"516-529"},"PeriodicalIF":2.7000,"publicationDate":"2024-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Grass and Forage Science","FirstCategoryId":"97","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/gfs.12697","RegionNum":3,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRONOMY","Score":null,"Total":0}
Citations: 0

Abstract

Accurate extraction of grassland sample coverage is crucial for regional ecological environment monitoring. Owing to their strong feature-learning capability, high flexibility, and scalability, deep learning methods have great potential for grassland sample extraction modelling. However, considering the small-target characteristics of grassland vegetation and the requirements for later model deployment, a model that combines a lightweight structure with effective small-object segmentation performance is still lacking. Here, we combined the UNet model, which performs well in small-target segmentation, with the lightweight Shufflenetv2 network, proposing an improved UNet neural network, Shufflenetv2UNet, for grassland sample coverage extraction. The core of Shufflenetv2UNet is the removal of the max-pooling and double-convolution modules from the downsampling path of the UNet network. In addition, the Inverted Residual Block module from Shufflenetv2 was added to achieve a lightweight model and improve extraction accuracy. Shufflenetv2UNet achieves an accuracy of 98.23%, with a parameter size of 50.74 M and an inference time of 0.004 s. Compared with existing extraction methods, this model has advantages in prediction accuracy, parameter size, and inference speed. Moreover, Shufflenetv2UNet extracted coverage from different types of grassland samples with good robustness, generalization, and universality, enabling investigators to obtain grassland sample coverage quickly and accurately. This provides more dynamic and accurate ground measurement data for regional grassland environmental monitoring.
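The abstract describes the architectural change only at a high level: the max-pooling and double-convolution downsampling stages of UNet are replaced with Shufflenetv2 stride-2 building blocks, while the decoder keeps the usual UNet skip connections. Below is a minimal PyTorch sketch of that idea; the layer counts, channel widths, stem and decoder design, and the class names (Shufflenetv2UNetSketch, ShuffleDown, Up) are assumptions for illustration and do not reproduce the paper's exact configuration.

```python
# Minimal sketch of a Shufflenetv2UNet-style encoder-decoder (not the authors' exact model).
import torch
import torch.nn as nn


def channel_shuffle(x, groups=2):
    # Rearrange channels so information mixes across the concatenated branches.
    n, c, h, w = x.size()
    x = x.view(n, groups, c // groups, h, w).transpose(1, 2).contiguous()
    return x.view(n, c, h, w)


class ShuffleDown(nn.Module):
    """Shufflenetv2 stride-2 unit: both branches downsample, outputs are concatenated."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        branch_ch = out_ch // 2
        self.branch1 = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, 3, stride=2, padding=1, groups=in_ch, bias=False),
            nn.BatchNorm2d(in_ch),
            nn.Conv2d(in_ch, branch_ch, 1, bias=False),
            nn.BatchNorm2d(branch_ch), nn.ReLU(inplace=True),
        )
        self.branch2 = nn.Sequential(
            nn.Conv2d(in_ch, branch_ch, 1, bias=False),
            nn.BatchNorm2d(branch_ch), nn.ReLU(inplace=True),
            nn.Conv2d(branch_ch, branch_ch, 3, stride=2, padding=1, groups=branch_ch, bias=False),
            nn.BatchNorm2d(branch_ch),
            nn.Conv2d(branch_ch, branch_ch, 1, bias=False),
            nn.BatchNorm2d(branch_ch), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return channel_shuffle(torch.cat([self.branch1(x), self.branch2(x)], dim=1))


class Up(nn.Module):
    """Plain UNet decoder stage: upsample, concatenate the skip feature, two 3x3 convolutions."""
    def __init__(self, in_ch, skip_ch, out_ch):
        super().__init__()
        self.up = nn.ConvTranspose2d(in_ch, out_ch, 2, stride=2)
        self.conv = nn.Sequential(
            nn.Conv2d(out_ch + skip_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x, skip):
        return self.conv(torch.cat([self.up(x), skip], dim=1))


class Shufflenetv2UNetSketch(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True))
        # Each ShuffleDown replaces a "max pool + double conv" UNet downsampling stage.
        self.down1 = ShuffleDown(32, 64)
        self.down2 = ShuffleDown(64, 128)
        self.down3 = ShuffleDown(128, 256)
        self.up1 = Up(256, 128, 128)
        self.up2 = Up(128, 64, 64)
        self.up3 = Up(64, 32, 32)
        self.head = nn.Conv2d(32, n_classes, 1)

    def forward(self, x):
        s0 = self.stem(x)
        s1 = self.down1(s0)
        s2 = self.down2(s1)
        bottom = self.down3(s2)
        x = self.up1(bottom, s2)
        x = self.up2(x, s1)
        x = self.up3(x, s0)
        return self.head(x)


if __name__ == "__main__":
    model = Shufflenetv2UNetSketch(n_classes=2)
    logits = model(torch.randn(1, 3, 256, 256))
    print(logits.shape)  # torch.Size([1, 2, 256, 256]) per-pixel class scores
```

The design choice mirrored here is the one the abstract highlights: depthwise-separable, channel-split Shufflenetv2 units handle the resolution reduction in the encoder, which keeps the downsampling path lightweight while the standard UNet decoder and skip connections preserve the fine detail needed for small vegetation targets.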

Source journal
Grass and Forage Science (Agricultural and Forest Sciences - Agronomy)
CiteScore: 5.10
Self-citation rate: 8.30%
Articles per year: 37
Review time: 12 months
Journal description: Grass and Forage Science is a major English language journal that publishes the results of research and development in all aspects of grass and forage production, management and utilization; reviews of the state of knowledge on relevant topics; and book reviews. Authors are also invited to submit papers on non-agricultural aspects of grassland management such as recreational and amenity use and the environmental implications of all grassland systems. The Journal considers papers from all climatic zones.