Unmanned aerial systems (UAS) offer significant potential to improve agricultural practices due to their multi-modal payload capacity, ease of deployment, and low cost. However, there is a need to expand UAS capabilities to include root crops, to offer robust, growth-stage-independent models, and to provide a comprehensive assessment of various imaging systems, i.e., identifying application-specific sensing modalities. This study addresses those challenges by presenting a unified Gaussian process regression (GPR) model for predicting end-of-season table beet (a subterranean root crop) yield from UAS-derived spectral and structural features, combined with meteorological data, while remaining robust to flight and harvest timing. Field trials were conducted at Cornell AgriTech in Geneva, NY during the 2021 and 2022 growing seasons. UAS flights captured five-band (475, 560, 668, 717, and 840 nm) multispectral imagery, hyperspectral imagery (400–1000 nm), and light detection and ranging (LiDAR) data at multiple times throughout the season. Our model achieved R² = 0.81 and MAPE = 15.7% on the test set using only multispectral imagery, while the hyperspectral + LiDAR model attained R² = 0.79 and MAPE = 17.4%, comparable to recent root yield modeling studies using UAS data. Shapley (SHAP) analysis was performed to gain further insight into model behavior; it revealed that canopy volume features carried high relative importance for table beet root yield estimation, as compared to the other features. Our study demonstrates that UAS-based imaging, combined with a unified machine learning model, can effectively predict root crop yield, providing a scalable and transferable approach for precision agriculture.
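The modeling pipeline described above can be illustrated with a minimal sketch (not the authors' code): a Gaussian process regression fit on synthetic stand-ins for UAS-derived features (here, a canopy-volume value and a vegetation index, both fabricated), followed by a feature-importance check. Permutation importance is used here as a lightweight stand-in for the Shapley analysis in the study; all data, feature names, and coefficients below are illustrative assumptions.

```python
# Minimal GPR yield-modeling sketch with synthetic data (illustrative only;
# not the study's actual features, data, or hyperparameters).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.inspection import permutation_importance
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
canopy_volume = rng.uniform(0.5, 3.0, n)   # synthetic per-plot canopy volume
veg_index = rng.uniform(0.2, 0.9, n)       # synthetic NDVI-like value
X = np.column_stack([canopy_volume, veg_index])
# Synthetic "yield": dominated by canopy volume, plus noise, mirroring the
# paper's finding that canopy volume carries high relative importance.
y = 10.0 * canopy_volume + 4.0 * veg_index + rng.normal(0.0, 0.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# RBF kernel with a white-noise term; normalize_y centers/scales the target.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.25)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True, random_state=0)
gpr.fit(X_tr, y_tr)

r2 = r2_score(y_te, gpr.predict(X_te))
print(f"R2_test = {r2:.2f}")

# Permutation importance as a simple proxy for Shapley-style attribution.
imp = permutation_importance(gpr, X_te, y_te, n_repeats=10, random_state=0)
for name, mean in zip(["canopy_volume", "veg_index"], imp.importances_mean):
    print(f"{name}: {mean:.3f}")
```

On this synthetic target, shuffling the canopy-volume column degrades predictions far more than shuffling the vegetation index, which is the same kind of evidence the study's Shapley analysis provides for the real features.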
