Accurate Leaf Area Index (LAI) estimation at the soybean plot scale is achievable using high-resolution Unmanned Aerial Vehicle (UAV) imagery and field measurement samples. However, the limited coverage of UAV flights restricts large-scale remote sensing monitoring in expansive soybean fields. This study leverages the broad coverage and 3-m resolution of PlanetScope satellite imagery to extend LAI prediction from the UAV to the satellite scale through transfer learning, using UAV-scale LAI estimates as a benchmark to validate cross-scale consistency. To this end, we propose LAI-TransNet, a two-stage transfer learning framework for precise and scalable soybean LAI prediction across large areas. In Stage 1, a UAV-scale benchmark is established using PROSAIL-simulated UAV reflectance data (UAV-Sim) and field-measured soybean LAI. Traditional machine learning, deep learning, and transfer learning models are trained on a hybrid dataset combining UAV-Sim and field measurements (UAV-Sim_Measured); the transfer learning model CNN-TL, fine-tuned from weights pre-trained on UAV-Sim, achieves the highest accuracy (R² = 0.81, RMSE = 0.64 m²/m², rRMSE = 11.5%). In Stage 2, LAI-TransNet is developed by fine-tuning CNN-TL on PlanetScope-simulated data (PS-Sim), preprocessed via cross-domain mapping to align UAV and satellite spectral features; real PlanetScope imagery is corrected for reflectance consistency against UAV spectral profiles. LAI-TransNet outperforms deep learning models trained directly on PS-Sim (R² = 0.69 vs. 0.60–0.63), ensuring robust cross-scale consistency. By bridging the UAV and satellite scales, LAI-TransNet enables large-scale soybean LAI monitoring with PlanetScope imagery, supporting precision agriculture management.
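The sketch below illustrates the two-stage transfer-learning pattern the abstract describes: pre-train a CNN regressor on PROSAIL-simulated UAV reflectance (UAV-Sim), fine-tune it on the hybrid UAV-Sim_Measured set to obtain CNN-TL, then fine-tune again on PlanetScope-simulated data (PS-Sim) to obtain LAI-TransNet. It is a minimal illustration, not the paper's implementation: the band count, network architecture, freezing strategy, and the random stand-in data are all assumptions, and the cross-domain spectral mapping step is taken as already applied to PS-Sim.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

N_BANDS = 4  # assumption: shared blue/green/red/NIR bands for UAV and PlanetScope


class LAIRegressor(nn.Module):
    """Small 1D CNN mapping per-pixel reflectance spectra to an LAI value."""

    def __init__(self, n_bands: int = N_BANDS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(32 * n_bands, 64), nn.ReLU(), nn.Linear(64, 1)
        )

    def forward(self, x):  # x: (batch, n_bands)
        return self.head(self.features(x.unsqueeze(1))).squeeze(-1)


def fit(model, X, y, epochs=50, lr=1e-3):
    """Generic MSE training loop reused for pre-training and both fine-tunings."""
    loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)
    # Only optimize parameters that are not frozen.
    opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            loss_fn(model(xb), yb).backward()
            opt.step()
    return model


# Random stand-ins for the datasets named in the abstract (hypothetical).
torch.manual_seed(0)
uav_sim_X, uav_sim_y = torch.rand(2000, N_BANDS), torch.rand(2000) * 7   # UAV-Sim
uav_hyb_X, uav_hyb_y = torch.rand(300, N_BANDS), torch.rand(300) * 7     # UAV-Sim_Measured
ps_sim_X, ps_sim_y = torch.rand(1000, N_BANDS), torch.rand(1000) * 7     # PS-Sim (after cross-domain mapping)

# Stage 1: pre-train on UAV-Sim, then fine-tune on the hybrid set -> CNN-TL.
model = fit(LAIRegressor(), uav_sim_X, uav_sim_y)
model = fit(model, uav_hyb_X, uav_hyb_y, epochs=20, lr=1e-4)

# Stage 2: freeze the spectral feature extractor and fine-tune the regression
# head on PS-Sim -> LAI-TransNet (freezing choice is an assumption here).
for p in model.features.parameters():
    p.requires_grad = False
model = fit(model, ps_sim_X, ps_sim_y, epochs=20, lr=1e-4)
```

Freezing the convolutional features while adapting only the head is one common way to transfer spectral representations across sensors with a small lower learning rate; whether the paper freezes layers or fine-tunes end-to-end is not stated in the abstract.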
