Biochar colloids (BCs) have attracted much attention globally, and their fate and transport in the subsurface are significantly influenced by soil-derived dissolved organic matter (DOM). This study used a real-time, non-invasive visualization system to reveal the transport and retention behavior of BCs in the presence of DOM in two-dimensional porous media. Results indicated that DOM enhanced the transport of BCs by increasing their negative charge density and thus the repulsion between them. The particle size of the porous medium was also shown to affect BC transport. Additionally, high ionic strength (IS) shielded the negative charge of BCs, so their mobility decreased by 30.37 % as IS increased from 1 mM to 50 mM. When the pH was increased from 5 to 9, the oxygen-containing functional groups of BCs and DOM dissociated, and the mobility of BCs increased by 32.41 %. Using a simplified Double-Monod model, we fitted the breakthrough curves for BC transport in porous media (R2 > 0.94). Moreover, the mechanisms by which different conditions affect colloid clogging behavior were further elucidated through DLVO theory. These findings extend the understanding of the environmental behavior of BCs in the presence of soil-derived DOM, enabling better assessment and prediction of their environmental risks.
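The IS dependence described above follows directly from classical DLVO theory: raising the electrolyte concentration compresses the double layer and lowers the repulsive energy barrier between colloid and collector. The sketch below illustrates this with a standard sphere-plate DLVO profile (Hogg-Healy-Fuerstenau double-layer term plus Gregory's retarded van der Waals term); the particle radius, surface potentials, and Hamaker constant are illustrative assumptions, not values from the study.

```python
import math

# Physical constants (SI units)
E0, KB, NA, E_CHARGE = 8.854e-12, 1.381e-23, 6.022e23, 1.602e-19

def dlvo_energy(h, ionic_strength, a=1e-7, psi_c=-0.030, psi_s=-0.020,
                hamaker=1e-20, eps_r=78.5, temp=298.0, wavelength=1e-7):
    """Total sphere-plate DLVO interaction energy (J) at separation h (m).

    Parameter defaults (colloid radius a, surface potentials psi_c/psi_s,
    Hamaker constant) are illustrative assumptions, not measured values.
    """
    eps = E0 * eps_r
    # Inverse Debye length for a 1:1 electrolyte; ionic_strength in mol/L
    kappa = math.sqrt(2e3 * NA * E_CHARGE**2 * ionic_strength / (eps * KB * temp))
    kh = kappa * h
    # Electric double-layer repulsion (Hogg-Healy-Fuerstenau, constant potential)
    edl = math.pi * eps * a * (
        2 * psi_c * psi_s * math.log((1 + math.exp(-kh)) / (1 - math.exp(-kh)))
        + (psi_c**2 + psi_s**2) * math.log(1 - math.exp(-2 * kh))
    )
    # Retarded van der Waals attraction (Gregory approximation)
    vdw = -hamaker * a / (6 * h * (1 + 14 * h / wavelength))
    return edl + vdw

def energy_barrier(ionic_strength):
    """Maximum of the DLVO profile over separations of 0.5-100 nm, in kT."""
    kt = KB * 298.0
    return max(dlvo_energy(n * 0.5e-9, ionic_strength) / kt for n in range(1, 201))
```

With these illustrative parameters, the barrier at 1 mM is several times higher than at 50 mM, consistent with the reduced BC mobility observed at high IS.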
Detritus is a vital support for the microbial food web, which in turn affects river ecological conditions. Determining the effects of detritus availability on microbial food webs in rivers is critical for protecting river ecological functions. However, detritus availability is difficult to estimate directly, because detritus transformation processes (i.e., detritus availability) and flow-induced transport processes are interdependent in rivers. Therefore, this study quantified the detritus transformation processes in a natural river and further identified the impacts of detritus availability on microbial food web patterns. Results revealed that flow velocity was the main physical driver determining detritus availability: decreased velocity promoted detritus availability. Moreover, increased detritus availability significantly promoted the diversity of bacteria, protozoa, and metazoans (p < 0.05). The responses of lower trophic levels to detritus availability were significantly greater than those of higher trophic levels, emphasizing the bottom-up cascading effect of detritus availability on microbial food web composition (p < 0.05). From a microbial food web perspective, detritus availability was amplified as flow velocity decreased, promoting trophic transfer efficiency between trophic levels. These findings reveal the ecological effect of detritus transformation processes on multiple trophic levels in rivers and provide useful information for river management.
This study proposes a two-step probabilistic post-processing approach that combines different machine learning-based postprocessors through the Copula-Embedded Bayesian Model Averaging (COP-BMA) method to improve the performance of a hydrological model for streamflow predictions. The proposed approach serves a two-fold purpose: firstly, it enhances the accuracy of streamflow predictions, and secondly, it provides probabilistic results that implicitly address the structural uncertainty inherent in different postprocessing methods. We validate our approach by applying it to the Conceptual Functional Equivalent, a lumped hydrologic model used for simulating extreme floods during Hurricane Harvey. The validation is conducted across twelve distinct watersheds in the Southeast Texas region at both daily and monthly scales. The findings indicate that the proposed framework significantly enhances the performance of the hydrologic model across the studied watersheds. Specifically, on a daily time scale there is a 23% improvement in the Nash-Sutcliffe efficiency (NSE) and a 53% improvement in the Kling-Gupta efficiency (KGE), while on a monthly time scale the framework enhances NSE by 21% and KGE by 25%. Additionally, the MAE (in cms) was notably reduced from 4.64 to 2.23 on the daily scale, and from 2.8 to 1.65 on the monthly scale.
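The NSE and KGE scores reported here (and in the following abstracts) are standard goodness-of-fit metrics for hydrological simulations. A minimal reference implementation, using the 2009 form of KGE, is sketched below; both metrics equal 1 for a perfect simulation.

```python
import statistics

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; 0 means no better than the
    observed mean used as a constant predictor."""
    mean_obs = statistics.fmean(obs)
    num = sum((s - o) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def kge(obs, sim):
    """Kling-Gupta efficiency (Gupta et al., 2009 form): 1 is perfect."""
    mo, ms = statistics.fmean(obs), statistics.fmean(sim)
    so, ss = statistics.pstdev(obs), statistics.pstdev(sim)
    # Pearson correlation between observed and simulated series
    r = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / (len(obs) * so * ss)
    alpha = ss / so  # variability ratio
    beta = ms / mo   # bias ratio
    return 1.0 - ((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2) ** 0.5
```

Note that KGE decomposes model error into correlation, variability, and bias components, which is why post-processing can improve KGE and NSE by different amounts, as seen above.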
This paper introduces a novel approach for hyperparameter optimization of long short-term memory networks (LSTMs) to achieve highly accurate hourly streamflow and water level predictions in the realm of regional rainfall-runoff modeling. Leveraging simultaneous systematic optimization of 10 distinct hyperparameters by Random Search, the study achieves high predictive accuracy across 40 humid, flashy catchments in the Basque Country, northern Spain. By carefully designing the search space and incorporating domain expertise, the approach quickly converges to optimal and highly accurate network configurations with both efficiency and efficacy. The LSTMs ingested precipitation, temperature, and potential evapotranspiration as inputs to predict two targets, streamflow and water level, at an hourly timestep. On the test set, the optimized LSTM networks accurately predicted streamflow and water level with Nash-Sutcliffe (NSE) and Kling-Gupta (KGE) efficiencies as high as 0.97 in one of the catchments. Across all 40 studied catchments, the overall average NSE and KGE values for streamflow were 0.89 and 0.87, respectively; water level exhibited average NSE and KGE scores of 0.91 and 0.92.
Moreover, statistical analysis reveals significant differences in the performance of the two distinct optimized network architectures across hydrological catchments, underscoring the importance of deliberate network configuration selection after the random search. This selection process is vital for achieving higher performance in as many catchments as possible. The findings highlight opportunities for enhancing the “learning maturity” of regional hydrological deep learning LSTM networks. This research provides valuable insights for researchers and practitioners optimizing regional hydrological deep learning models for a variety of applications and new datasets.
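The Random Search procedure used in this study can be sketched as follows: sample candidate configurations uniformly from a predefined search space, train and evaluate each, and keep the best. The hyperparameter names and ranges below are illustrative assumptions (the paper optimizes 10 hyperparameters whose exact identities are not listed in the abstract), and the objective is a placeholder for a real train-and-validate step.

```python
import random

# Hypothetical LSTM search space; names and ranges are illustrative
# assumptions, not the paper's actual configuration.
SEARCH_SPACE = {
    "hidden_units":    [32, 64, 128, 256],
    "num_layers":      [1, 2, 3],
    "dropout":         [0.0, 0.1, 0.2, 0.4],
    "learning_rate":   [1e-4, 3e-4, 1e-3, 3e-3],
    "sequence_length": [24, 72, 168, 336],
    "batch_size":      [64, 128, 256],
}

def sample_config(rng):
    """Draw one candidate configuration uniformly from the search space."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

def random_search(objective, n_trials=50, seed=0):
    """Evaluate n_trials random configurations and return the best one.

    `objective` would, in practice, train an LSTM with the given
    configuration and return its validation score (e.g. NSE).
    """
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = sample_config(rng)
        score = objective(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

Because trials are independent, this loop parallelizes trivially, which is part of why Random Search remains competitive for moderate-dimensional search spaces like the 10-hyperparameter one used here.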
Urban waterlogging caused by rainstorms exerts a considerable adverse influence on road networks. Accurately evaluating the vulnerability of road networks during waterlogging is a crucial step toward alleviating flood risk and damage. Nevertheless, previous research considered only the maximum inundation depth of waterlogging, neglecting its dynamic variation. In this study, we first develop a coupled hydrodynamic model (IFMS-Urban + SWMM) and integrate it with a road traffic model to analyze the impact of waterlogging on road networks; we then introduce a time factor to assess the dynamic variation of flood vulnerability through a novel index, the road vulnerability recovery index (RVRI); finally, we investigate the mitigating effect of low impact development infrastructure/practices (LIDs) on road network vulnerability under waterlogging. The results show that both the coupled hydrodynamic model and the road traffic model were verified for logical soundness and exhibit strong adaptability. Urban waterlogging becomes increasingly severe as the return period of rainstorms increases, leading to a corresponding rise in the vulnerability of the road network. The evaluation of road network vulnerability becomes more comprehensive and reasonable when the time factor is considered. Well-arranged LIDs can effectively reduce road network vulnerability. This study serves as a valuable reference for dynamically assessing the vulnerability of road networks to urban waterlogging, offering guidance for urban waterlogging prevention, road traffic management, and road operation and maintenance in urbanized areas.
To investigate the feasibility of using temperature for tracking rainfall-runoff processes in karst catchments, this study developed a tracer-aided conceptual model using temperature as a tracer by coupling water and heat transport processes at the catchment scale. The model was calibrated and validated using hourly hydrometeorological and temperature data from a 1.25 km² karst catchment in south-western China. The results showed that the model was able to capture the water flux and temperature dynamics of different landscape units in the karst catchment. Utilizing this framework, the model delineated the flux age distribution within different landscape units, as well as the overall water transit times through the catchment. The average flux ages were determined to be approximately 80 days for the hillslope unit, 452 days for the slow flow system, and 260 days for the fast flow regime within the depression areas. These estimates align broadly with those acquired using stable isotopes as tracers. Comparative analysis revealed that the flux age distributions derived from the temperature and isotopic tracers exhibited analogous patterns at the catchment outlet and across the hillslope compartments. However, the temperature-based simulations suggested a higher proportion of very young and very old water in the outflow, pointing to a potential overestimation of these extreme age classes by the temperature-tracer model. From the temperature-simulated transit time distribution, about 31 % of the precipitation entering during the study period had left the catchment within 3 years, and a notable proportion of rainwater was either stored in the aquifer or lost through evapotranspiration.
The general characteristics of the transit time distribution simulated using temperature were similar to those simulated using isotopes, though a higher proportion of precipitation drained by fast flows was inferred from the temperature-based transit time distribution. Collectively, our study demonstrates that temperature can serve as a cost-effective tracer for modelling water age distributions and associated hydrological processes in karst catchments.
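The "31 % within 3 years" figure above is simply the cumulative transit time distribution (TTD) evaluated at 3 years. As an illustration of the concept (the study derives its TTD from the calibrated temperature-tracer model, not from a closed-form distribution), the sketch below uses an exponential TTD, a common idealization for well-mixed storage:

```python
import math

def fraction_discharged(t_years, mean_transit_time_years):
    """Cumulative fraction of input water discharged within t_years,
    assuming an exponential transit time distribution. This is an
    illustrative, idealized choice, not the model used in the study."""
    return 1.0 - math.exp(-t_years / mean_transit_time_years)
```

For example, under this idealization a mean transit time of roughly 8 years would yield a 3-year drainage fraction near 31 %; in reality the modelled TTD is shaped by the distinct hillslope, slow-flow, and fast-flow compartments described above.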
This study employed an operational monitoring network to continuously measure soil moisture and runoff behaviour in the Alpine catchment Geroldsbach-Götzens, Austria. We hypothesize that afforestation can have a positive impact on soil water buffering. To analyse the impact of soil properties and vegetation cover changes on soil water dynamics, four experimental plots were established on grassland and monitoring stations were installed in the forest. The rainfall test site is equipped with an automatic weather station to obtain meteorological observations, and with weirs to measure surface runoff from naturally occurring precipitation events and artificial rainfall simulations. In the plots, 200 soil moisture sensors were installed at five different depths to track and visualize infiltration and subsurface flow processes. Another twenty sensors monitored soil moisture at different afforestation stages in the forested part of the catchment. The measurements show that soils covered with young and old-growth forest have a higher and more stable soil moisture content throughout the seasons than grassland and soils lacking vegetation. We observed large spatial differences at the plot scale, where the spatial variability of soil moisture increases with depth and is highest during convective precipitation. Initial conditions and rainfall characteristics play an important role in infiltration processes and soil water storage. Our rainfall test site demonstrated both the challenges of innovative monitoring techniques and the opportunities it offers for further experiments to gather evidence-based data as input for flood models. Overall, the findings confirm the sponge effect of forest soils and indicate that afforestation as a Nature-Based Solution reduces temporal soil moisture variability, buffering soil water during precipitation events, which can be beneficial for runoff reduction in Alpine catchments.