{"title":"田间条件下基于立体视觉和回归卷积神经网络的荞麦株高估计","authors":"Jianlong Zhang, Wenwen Xing, Xuefeng Song, Yulong Cui, Wang Li, Decong Zheng","doi":"10.3390/agronomy13092312","DOIUrl":null,"url":null,"abstract":"Buckwheat plant height is an important indicator for producers. Due to the decline in agricultural labor, the automatic and real-time acquisition of crop growth information will become a prominent issue for farms in the future. To address this problem, we focused on stereo vision and a regression convolutional neural network (CNN) in order to estimate buckwheat plant height. MobileNet V3 Small, NasNet Mobile, RegNet Y002, EfficientNet V2 B0, MobileNet V3 Large, NasNet Large, RegNet Y008, and EfficientNet V2 L were modified into regression CNNs. Through a five-fold cross-validation of the modeling data, the modified RegNet Y008 was selected as the optimal estimation model. Based on the depth and contour information of buckwheat depth image, the mean absolute error (MAE), root mean square error (RMSE), mean square error (MSE), and mean relative error (MRE) when estimating plant height were 0.56 cm, 0.73 cm, 0.54 cm, and 1.7%, respectively. The coefficient of determination (R2) value between the estimated and measured results was 0.9994. Combined with the LabVIEW software development platform, this method can estimate buckwheat accurately, quickly, and automatically. This work contributes to the automatic management of farms.","PeriodicalId":56066,"journal":{"name":"Agronomy-Basel","volume":" ","pages":""},"PeriodicalIF":3.3000,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Buckwheat Plant Height Estimation Based on Stereo Vision and a Regression Convolutional Neural Network under Field Conditions\",\"authors\":\"Jianlong Zhang, Wenwen Xing, Xuefeng Song, Yulong Cui, Wang Li, Decong Zheng\",\"doi\":\"10.3390/agronomy13092312\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Buckwheat plant height is an important indicator for producers. Due to the decline in agricultural labor, the automatic and real-time acquisition of crop growth information will become a prominent issue for farms in the future. To address this problem, we focused on stereo vision and a regression convolutional neural network (CNN) in order to estimate buckwheat plant height. MobileNet V3 Small, NasNet Mobile, RegNet Y002, EfficientNet V2 B0, MobileNet V3 Large, NasNet Large, RegNet Y008, and EfficientNet V2 L were modified into regression CNNs. Through a five-fold cross-validation of the modeling data, the modified RegNet Y008 was selected as the optimal estimation model. Based on the depth and contour information of buckwheat depth image, the mean absolute error (MAE), root mean square error (RMSE), mean square error (MSE), and mean relative error (MRE) when estimating plant height were 0.56 cm, 0.73 cm, 0.54 cm, and 1.7%, respectively. The coefficient of determination (R2) value between the estimated and measured results was 0.9994. Combined with the LabVIEW software development platform, this method can estimate buckwheat accurately, quickly, and automatically. 
This work contributes to the automatic management of farms.\",\"PeriodicalId\":56066,\"journal\":{\"name\":\"Agronomy-Basel\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":3.3000,\"publicationDate\":\"2023-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Agronomy-Basel\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://doi.org/10.3390/agronomy13092312\",\"RegionNum\":2,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRONOMY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Agronomy-Basel","FirstCategoryId":"97","ListUrlMain":"https://doi.org/10.3390/agronomy13092312","RegionNum":2,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRONOMY","Score":null,"Total":0}
Buckwheat Plant Height Estimation Based on Stereo Vision and a Regression Convolutional Neural Network under Field Conditions
Abstract: Buckwheat plant height is an important indicator for producers. Due to the decline in the agricultural labor force, the automatic, real-time acquisition of crop growth information will become a prominent issue for farms in the future. To address this problem, we used stereo vision and a regression convolutional neural network (CNN) to estimate buckwheat plant height. MobileNet V3 Small, NasNet Mobile, RegNet Y002, EfficientNet V2 B0, MobileNet V3 Large, NasNet Large, RegNet Y008, and EfficientNet V2 L were modified into regression CNNs. Through five-fold cross-validation on the modeling data, the modified RegNet Y008 was selected as the optimal estimation model. Based on the depth and contour information in the buckwheat depth images, the mean absolute error (MAE), root mean square error (RMSE), mean square error (MSE), and mean relative error (MRE) of the plant height estimates were 0.56 cm, 0.73 cm, 0.54 cm², and 1.7%, respectively. The coefficient of determination (R²) between the estimated and measured values was 0.9994. Combined with the LabVIEW software development platform, this method can estimate buckwheat plant height accurately, quickly, and automatically. This work contributes to the automatic management of farms.
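The abstract describes two concrete operations: replacing a classification backbone's output layer so the network regresses a single plant-height value, and scoring the estimates with MAE, RMSE, MSE, MRE, and R². The sketch below is a minimal illustration of both, assuming PyTorch/torchvision (the paper itself reports a LabVIEW-based pipeline) and using RegNetY-800MF, the torchvision counterpart of RegNet Y008; the input shape, preprocessing, and training details are placeholders rather than the authors' published configuration.

```python
# Hedged sketch (not the authors' implementation): turn a torchvision
# RegNetY-800MF classifier into a single-output regression CNN for plant
# height, then score predictions with MAE, RMSE, MSE, MRE, and R^2.
import numpy as np
import torch
import torch.nn as nn
from torchvision import models

def build_height_regressor() -> nn.Module:
    # Load the ImageNet-pretrained backbone and swap its 1000-class
    # classification head for one linear unit that predicts height in cm.
    backbone = models.regnet_y_800mf(weights=models.RegNet_Y_800MF_Weights.DEFAULT)
    backbone.fc = nn.Linear(backbone.fc.in_features, 1)
    return backbone

def regression_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    # Error metrics named in the abstract, computed from their plain formulas.
    err = y_pred - y_true
    mae = np.mean(np.abs(err))                   # mean absolute error (cm)
    mse = np.mean(err ** 2)                      # mean square error (cm^2)
    rmse = np.sqrt(mse)                          # root mean square error (cm)
    mre = np.mean(np.abs(err) / y_true)          # mean relative error (fraction)
    ss_res = np.sum(err ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                   # coefficient of determination
    return {"MAE": mae, "RMSE": rmse, "MSE": mse, "MRE": mre, "R2": r2}

if __name__ == "__main__":
    model = build_height_regressor().eval()
    # Placeholder batch: depth maps resized to 224x224 and replicated to
    # 3 channels so the pretrained backbone can consume them.
    depth_batch = torch.rand(4, 3, 224, 224)
    with torch.no_grad():
        pred_cm = model(depth_batch).squeeze(1).numpy()
    true_cm = np.array([32.0, 41.5, 38.2, 45.0])  # placeholder measured heights
    print(regression_metrics(true_cm, pred_cm))
```

Replacing only the final fully connected layer keeps the pretrained features intact; in practice the whole network would typically be fine-tuned on the buckwheat depth images with an MSE or MAE loss before the metrics above are computed on held-out data.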
Agronomy-Basel (Agricultural and Biological Sciences: Agronomy and Crop Science)
CiteScore
6.20
Self-citation rate
13.50%
Articles published
2665
Review time
20.32 days
Journal introduction:
Agronomy (ISSN 2073-4395) is an international and cross-disciplinary scholarly journal on agronomy and agroecology. It publishes reviews, regular research papers, communications and short notes, and there is no restriction on the length of the papers. Our aim is to encourage scientists to publish their experimental and theoretical research in as much detail as possible. Full experimental and/or methodical details must be provided for research articles.