Daley et al. (2023a) argue that Digital Elevation Model (DEM)-derived DEMs of Difference (DoD) surveys at least 10–15 years apart are needed to detect reliable geomorphic change within the gullied landscapes of the Great Barrier Reef, Australia. We acknowledge that the reliability of observed geomorphic change increases with longer monitoring periods, as more subtle geomorphic processes become detectable. As further good-quality, long-term legacy datasets become available, we encourage utilising these to improve confidence in targeting erosion rehabilitation. However, our approach of consistently applying 2–3 year DoDs to contrasting gully morphologies enabled the capture of more intense geomorphic processes acting over shorter timeframes and provided valuable and timely information on (i) contrasting erosional mechanisms and erosion rates between variable gully morphologies, and (ii) the rehabilitation efforts undertaken. In this paper, we take the opportunity to concisely address all the concerns raised by Daley et al. (2023a).
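For readers unfamiliar with DoD analysis, the sketch below illustrates the basic differencing and thresholding step assumed in repeat-survey change detection. The array values, cell size, and minimum level of detection are illustrative placeholders, not values from either study.

```python
import numpy as np

# Hypothetical repeat-survey DEMs (m) on a common grid; values are illustrative only.
dem_t1 = np.array([[101.2, 101.0], [100.8, 100.5]])
dem_t2 = np.array([[101.1, 100.6], [100.7, 100.4]])
cell_area = 1.0   # m^2 per grid cell (assumed)
lod = 0.15        # minimum level of detection (m), set from survey uncertainty (assumed)

dod = dem_t2 - dem_t1             # DEM of Difference: negative = erosion, positive = deposition
significant = np.abs(dod) >= lod  # mask out change below the detection threshold
erosion_volume = -dod[significant & (dod < 0)].sum() * cell_area
deposition_volume = dod[significant & (dod > 0)].sum() * cell_area
print(f"Erosion: {erosion_volume:.2f} m^3, Deposition: {deposition_volume:.2f} m^3")
```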
Soil loss by water erosion is one of the main threats to soil health and food production in intensively used agricultural areas. To assess its significance to overall sediment production, we applied the Water and Tillage Erosion Model/Sediment Delivery Model (WaTEM/SEDEM) to the Luoyugou catchment, a subcatchment of the Yellow River Basin within the Chinese Loess Plateau. WaTEM/SEDEM considers rill and interrill erosion and deposition rates to calculate the sediment yield leaving the catchment. Terraces were established in the 1990s to reduce soil loss in this area, but no soil erosion modeling has been published regarding the effect of this mitigation measure. We therefore ran 1000 Monte Carlo simulations of WaTEM/SEDEM; the modeled average soil loss by rill and interrill erosion for 2020 was 12.2 ± 0.5 t ha−1 yr−1, with a sediment yield at the outlet of 53,207.8 ± 11,244.1 t yr−1. The results indicated that the terracing reduced gross soil loss rates (from 51.8 t ha−1 yr−1 in 1986 to 12.2 ± 0.5 t ha−1 yr−1 in 2020), while land cover changes, mainly the conversion of forests and grassland, partly counteracted the mitigation (combined effect: 76% reduction). Modeled sediment loads by rill and interrill erosion accounted for 22.8% of the total long-term sediment production recorded by flow discharge measurements. Other processes not considered by the model, such as landslides, gully erosion, riverbank erosion, and sediment production by construction, seem to predominantly influence the overall sediment yield. Considering only years with baseline sediment production, the measured and modeled sediment yields compared favorably, indicating that the latter processes contribute primarily during extreme events.
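As a rough illustration of how a Monte Carlo uncertainty estimate of this kind can be obtained, the sketch below repeatedly perturbs hypothetical model parameters and summarizes the resulting sediment yields. The function `run_erosion_model`, the transport-capacity coefficient names, and their ranges are assumptions standing in for the actual WaTEM/SEDEM configuration.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 1000  # number of Monte Carlo realizations, as in the study

def run_erosion_model(ktc_low, ktc_high):
    """Placeholder for a single WaTEM/SEDEM run returning a sediment yield (t/yr).
    The linear form below is purely illustrative."""
    return 30000.0 + 150.0 * ktc_low + 400.0 * ktc_high

# Hypothetical uniform ranges for the low/high transport-capacity coefficients (assumed).
ktc_low_samples = rng.uniform(10, 40, n_runs)
ktc_high_samples = rng.uniform(40, 100, n_runs)

yields = np.array([run_erosion_model(lo, hi)
                   for lo, hi in zip(ktc_low_samples, ktc_high_samples)])
print(f"Sediment yield: {yields.mean():,.1f} ± {yields.std():,.1f} t/yr")
```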
Current watershed-scale, nonpoint source pollution models do not represent the processes and impacts of agricultural best management practices (BMPs) on water quality in sufficient detail. The Water Erosion Prediction Project-Water Quality (WEPP-WQ) model was recently developed and is capable of simulating nonpoint source pollutant transport in nonuniform hillslope conditions such as those with BMPs. However, WEPP-WQ has not been validated for these conditions, and prior validation work evaluated only calibrated performance rather than uncalibrated performance, with the latter being most relevant to model applications. This study evaluated uncalibrated and calibrated model performance in two plot-scale, artificial rainfall studies. A total of 179 observations were compared to corresponding WEPP-WQ simulations of runoff, sediment yield, and soluble and particulate nutrient forms of both nitrogen and phosphorus. Uncalibrated validation results were mixed across the different field conditions, model configurations, and prediction variables. Nash-Sutcliffe Efficiencies (NSE) for uncalibrated simulations of uniform conditions were generally greater than 0.6, except for soluble nitrogen predictions, which were poor. Simulations of nonuniform conditions were generally ‘unsatisfactory’, except for runoff predictions, which were quite good (NSE = 0.78). Performance improved substantially for almost all endpoints with calibration. Some exceptions occurred because the objective function for calibration was based on log-space differences so as to more equally weight calibration of unsaturated conditions that tend to produce smaller runoff volumes and sediment yields. Calibrated results for both uniform and nonuniform conditions were generally ‘satisfactory’ or ‘good’ according to widely accepted model performance criteria.
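The performance metric and the log-space calibration objective mentioned above can be written compactly. The sketch below is a generic formulation, not the WEPP-WQ implementation; the small epsilon guarding against zero values is an assumption.

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit, <= 0 is no better than the observed mean."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

def log_space_objective(observed, simulated, eps=1e-6):
    """Sum of squared log-space differences, which weights small runoff/sediment values
    more equally than a linear-space objective would (eps avoids log of zero; assumed)."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return np.sum((np.log(observed + eps) - np.log(simulated + eps)) ** 2)

# Illustrative observed vs. simulated runoff depths (mm) from a plot study.
obs = [2.1, 15.0, 0.4, 8.3]
sim = [2.5, 13.2, 0.6, 9.1]
print(nse(obs, sim), log_space_objective(obs, sim))
```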
Conservation tillage is an important conservation measure for arable land in modern agricultural production and plays an essential role in protecting black soil and improving arable land quality. Estimating maize residue cover (MRC) reveals the spatial distribution of conservation tillage, which is essential for government departments to promote conservation tillage technology and monitor its implementation. In this paper, the southern part of the Songnen Plain was used as the study area; Sentinel-2 MSI and Sentinel-1 SAR images were used as data sources, and spectral indices and radar backscatter coefficients were correlated with field sampling data from the study area. MRC estimation models for the study area were constructed using the Random Forest (RF) model, the Multiple Linear Stepwise Regression (MLSR) model, and the Back Propagation Neural Network (BPNN) model. The results showed that the correlation coefficients between MRC and the normalized difference tillage index (NDTI), simple tillage index (STI), normalized difference indices NDI5 and NDI7, shortwave infrared normalized difference residue index (SINDRI), normalized difference senescent vegetation index (NDSVI), and normalized difference residue indices NDRI2–NDRI9 were all greater than 0.4; the correlations were highest for NDTI and STI, which reached 0.861 and 0.860, respectively. The correlation coefficient between VV and MRC was 0.56, and that between VH and MRC was 0.594. We used the MLSR, RF, and BPNN methods in combination with Sentinel-2 MSI and Sentinel-1 SAR images for MRC estimation. The synergistic use of Sentinel-2 MSI and Sentinel-1 SAR images improved the accuracy of the MRC estimation models, raising R2 above 0.8 for all three models. Based on a statistical analysis of the remote sensing estimates, the average MRC of the maize growing areas in Changchun, Siping, and eastern Songyuan in November 2020 was 66%, and 2% of farmland in the study area had an MRC of less than 30%.
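As a minimal sketch of the index-plus-regression workflow, the snippet below computes NDTI and STI from Sentinel-2 SWIR bands (B11, B12) and fits a Random Forest to hypothetical plot samples. The band reflectances, backscatter values, sample MRC fractions, and hyperparameters are placeholders, not the study's data or settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical per-plot Sentinel-2 SWIR reflectances (B11, B12) and Sentinel-1 backscatter (VV, VH, dB).
b11 = np.array([0.28, 0.31, 0.25, 0.35, 0.30])
b12 = np.array([0.22, 0.26, 0.21, 0.27, 0.24])
vv  = np.array([-11.2, -10.5, -12.0, -9.8, -10.9])
vh  = np.array([-17.5, -16.8, -18.1, -15.9, -17.0])
mrc = np.array([0.45, 0.60, 0.30, 0.75, 0.55])  # field-measured maize residue cover (fraction)

ndti = (b11 - b12) / (b11 + b12)  # normalized difference tillage index
sti = b11 / b12                   # simple tillage index

X = np.column_stack([ndti, sti, vv, vh])  # combined optical + SAR predictors
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, mrc)
print(model.predict(X[:2]))       # predicted MRC for the first two plots
```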
Global climate change and overgrazing are driving shifts in the plant composition of grassland communities, which may profoundly affect the function of grassland ecosystems in regulating runoff and soil erosion. Here, we examined the effects of shifts from the normal hillslope alpine meadow state to shrub meadow and severely degraded meadow states on runoff and sediment generation under natural rainfall conditions, and we determined the contributions of changes in plant and soil properties to soil erodibility, runoff, and sediment generation through in situ rainfall experiments and monitoring on hillslopes of the Qinghai-Tibetan Plateau. The results showed that when normal meadow shifted into the severely degraded meadow state, topsoil mean weight diameter, soil saturated hydraulic conductivity, soil cohesion, and the soil erodibility K-factor changed by −70.3%, −73.1%, −80.3%, and +13.1%, respectively; when normal meadow shifted into the shrub meadow state, they changed by −49.1%, +1.3%, −49.4%, and +8.3%, respectively. Runoff and soil loss changed significantly by −40.0% and +177.8% when normal meadow shifted into the severely degraded meadow state, and by +65.0% and +77.8% when normal meadow shifted into the shrub meadow state. Our findings highlight that both divergent shifts increased soil loss compared to the normal hillslope alpine meadow. Overall, our results indicate that the divergent shifts of normal alpine meadows exacerbated soil erodibility and soil loss on hillslope alpine meadows. These results offer a novel perspective on the regulation of runoff and soil erosion in the alpine meadow ecosystem.
Machine learning (ML) is becoming an ever more important tool in hydrologic modeling. Previous studies have shown that ML models can achieve higher prediction accuracy than traditional process-based models. ML offers a further advantage: lower computational demand. This matters for applications such as hydraulic soil erosion estimation over large areas at fine spatial scales, where traditional models such as the Rangeland Hydrology and Erosion Model (RHEM) require prohibitive computation time and resources. In this study, we designed an Artificial Neural Network that recreates the RHEM outputs (annual average runoff, soil loss, and sediment yield, not the daily storm event-based values) with high accuracy (Nash-Sutcliffe Efficiency ≈ 1.0) and very low computational time (13 billion times faster on average using a GPU). We ran RHEM for more than a million synthetic scenarios and trained the Emulator with them. We also fine-tuned the trained Emulator with RHEM runs of real-world scenarios (more than 32,000) so that the Emulator remains comprehensive while being especially accurate for real-world cases. We further showed that the sensitivity of the Emulator to the input variables is similar to that of RHEM and that it effectively captures the changes in the RHEM outputs when an input variable varies. Finally, the dynamic prediction behavior of the Emulator is statistically similar to that of RHEM.
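A minimal sketch of the train-then-fine-tune emulator idea follows, using scikit-learn's MLPRegressor rather than the authors' network. The stand-in function `rhem_like`, the assumed input variables, the network size, and the use of warm_start refitting for fine-tuning are illustrative assumptions only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def rhem_like(X):
    """Stand-in for RHEM annual outputs (runoff, soil loss, sediment yield); purely illustrative."""
    runoff = 0.3 * X[:, 0] + 0.1 * X[:, 1]
    soil_loss = 0.05 * X[:, 0] * (1.0 - X[:, 2])
    sediment_yield = 0.8 * soil_loss
    return np.column_stack([runoff, soil_loss, sediment_yield])

# Stage 1: train on a large synthetic design (e.g., slope, rainfall factor, ground cover; assumed inputs).
X_syn = rng.uniform(0.0, 1.0, size=(10000, 3))
emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300,
                        warm_start=True, random_state=0)
emulator.fit(X_syn, rhem_like(X_syn))

# Stage 2: fine-tune on a smaller set of "real-world" scenarios (here just another sample).
X_real = rng.uniform(0.1, 0.9, size=(2000, 3))
emulator.fit(X_real, rhem_like(X_real))  # warm_start=True continues from the trained weights

print(emulator.predict(X_real[:3]))
```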