Canopy Height Mapper: A Google Earth Engine application for predicting global canopy heights combining GEDI with multi-source data
Pub Date: 2024-11-12 | DOI: 10.1016/j.envsoft.2024.106268 | Environmental Modelling & Software, Vol. 183, Article 106268
Cesar Alvites , Hannah O'Sullivan , Saverio Francini , Marco Marchetti , Giovanni Santopuoli , Gherardo Chirici , Bruno Lasserre , Michela Marignani , Erika Bazzato
Spatially and temporally discontinuous canopy height footprints collected by NASA's GEDI (Global Ecosystem Dynamics Investigation) mission are accessible on the Google Earth Engine (GEE) cloud computing platform. This study introduces an open-source, user-friendly, code-free GEE web application called Canopy Height Mapper (CH-GEE), available at https://ee-calvites1990.projects.earthengine.app/view/ch-gee, which automatically generates high-resolution (10 m) canopy height maps for a user-defined area by integrating GEDI with multi-source remote sensing data: Copernicus and topographical data from the GEE data catalogue. CH-GEE generates calibrated canopy height maps worldwide at local-to-country scales using machine learning algorithms, leveraging the GEE platform's big data and cloud computing capabilities. CH-GEE allows customization of the geographic area, the machine learning algorithms, and the time windows for GEDI and predictor data. Canopy heights generated by CH-GEE were validated against the Italian National Forest Inventory across 110,000 km² at multiple scales (country-level R² = 0.89, RMSE = 17%). CH-GEE's accuracy and scalability make it suitable for forest monitoring.
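CH-GEE itself is code-free, but the same general workflow can be scripted. The sketch below is a minimal, illustrative reconstruction with the Earth Engine Python API: it samples rasterized GEDI relative-height footprints over a Sentinel-2 and topography predictor stack and fits a random forest regressor. The region, dates, asset IDs, and band names are assumptions for illustration, not the exact CH-GEE configuration.

```python
import ee

ee.Initialize()  # assumes prior ee.Authenticate() / project configuration

# Illustrative area of interest and time windows (not CH-GEE defaults).
region = ee.Geometry.Rectangle([14.0, 41.5, 14.6, 42.0])

# Rasterized GEDI L2A relative height (rh98) as the response variable.
gedi = (ee.ImageCollection('LARSE/GEDI/GEDI02_A_002_MONTHLY')
        .filterBounds(region).filterDate('2022-01-01', '2023-01-01')
        .select('rh98').mosaic())

# Copernicus Sentinel-2 median composite plus topographic predictors.
s2 = (ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED')
      .filterBounds(region).filterDate('2022-05-01', '2022-09-30')
      .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20))
      .median().select(['B2', 'B3', 'B4', 'B8', 'B11', 'B12']))
dem = ee.Image('USGS/SRTMGL1_003')
predictors = s2.addBands(dem.select('elevation')).addBands(ee.Terrain.slope(dem))

# Sample GEDI canopy-height footprints over the predictor stack at 10 m.
samples = predictors.addBands(gedi).sample(region=region, scale=10,
                                           numPixels=5000, dropNulls=True)

# Random forest regression and wall-to-wall prediction.
rf = (ee.Classifier.smileRandomForest(numberOfTrees=200)
      .setOutputMode('REGRESSION')
      .train(features=samples, classProperty='rh98',
             inputProperties=predictors.bandNames()))
canopy_height = predictors.classify(rf).rename('canopy_height_m')
```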
{"title":"Canopy height Mapper: A google earth engine application for predicting global canopy heights combining GEDI with multi-source data","authors":"Cesar Alvites , Hannah O'Sullivan , Saverio Francini , Marco Marchetti , Giovanni Santopuoli , Gherardo Chirici , Bruno Lasserre , Michela Marignani , Erika Bazzato","doi":"10.1016/j.envsoft.2024.106268","DOIUrl":"10.1016/j.envsoft.2024.106268","url":null,"abstract":"<div><div>Spatially and temporally discontinuous canopy height footprints collected by NASA's GEDI (Global Ecosystem Dynamics Investigation) mission are accessible on the Google Earth Engine (GEE) cloud computing platform. This study introduces an open-source, user-friendly, code-free GEE web application called Canopy Height Mapper (CH-GEE), available at <span><span>https://ee-calvites1990.projects.earthengine.app/view/ch-gee</span><svg><path></path></svg></span>, which automatically generates (10 m) high-resolution canopy height maps for a specific area by integrating GEDI with multi-source remote sensing data: Copernicus and topographical data from the GEE data catalogue. CH-GEE generates local-to-country scale calibrated canopy height maps worldwide using machine learning algorithms and leveraging the GEE platform's big data and cloud computing capabilities. CH-GEE allows customization of geographic area, algorithms and time windows for GEDI and predictors. Canopy heights generated by CH-GEE were validated using the Italian National Forest Inventory across 110,000 km<sup>2</sup> at multiple scales (Country-based R-squared = 0.89, RMSE = 17%). CH-GEE's accuracy and scalability make it suitable for forest monitoring.</div></div>","PeriodicalId":310,"journal":{"name":"Environmental Modelling & Software","volume":"183 ","pages":"Article 106268"},"PeriodicalIF":4.8,"publicationDate":"2024-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142672813","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Taxonomy of purposes, methods, and recommendations for vulnerability analysis
Pub Date: 2024-11-12 | DOI: 10.1016/j.envsoft.2024.106269 | Environmental Modelling & Software, Vol. 183, Article 106269
Nathan Bonham , Joseph Kasprzyk , Edith Zagona
Vulnerability analysis is an emerging technique that discovers concise descriptions of the conditions that lead to decision-relevant outcomes (i.e., scenarios) by applying machine learning methods to a large ensemble of simulation model runs. This review organizes vulnerability analysis methods into a taxonomy and compares them in terms of interpretability, flexibility, and accuracy. Our review contextualizes interpretability in terms of five purposes for vulnerability analysis, such as adaptation systems and choosing between policies. We make recommendations for designing a vulnerability analysis that is interpretable for a specific purpose. Furthermore, a numerical experiment demonstrates how methods can be compared based on interpretability and accuracy. Several research opportunities are identified, including new developments in machine learning that could reduce computing requirements and improve interpretability. Throughout the review, a consistent example of reservoir operation policies in the Colorado River Basin illustrates the methods.
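As a concrete, deliberately simplified illustration of the kind of method the review surveys, the sketch below applies one common scenario-discovery technique, a shallow classification tree, to a synthetic ensemble of simulation runs and prints the resulting scenario description. The factor names, thresholds, and vulnerability definition are invented for illustration and are not taken from the review or its Colorado River Basin example.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Synthetic ensemble: each row is one simulation run described by uncertain factors.
n = 2000
X = np.column_stack([
    rng.uniform(0.7, 1.3, n),   # demand multiplier (hypothetical factor)
    rng.uniform(-0.3, 0.1, n),  # streamflow trend (hypothetical factor)
    rng.uniform(0.0, 1.0, n),   # policy aggressiveness (hypothetical factor)
])
# Hypothetical outcome: a run is "vulnerable" when demand is high and flows decline.
vulnerable = (X[:, 0] > 1.1) & (X[:, 1] < -0.15)

# A shallow tree keeps the scenario description concise and interpretable;
# the printed splits are the discovered "conditions that lead to vulnerability".
tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=50).fit(X, vulnerable)
print(export_text(tree, feature_names=["demand", "flow_trend", "policy"]))
```

The depth limit is the interpretability/accuracy trade-off the review discusses: deeper trees fit the ensemble better but yield scenario descriptions that are harder to communicate.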
{"title":"Taxonomy of purposes, methods, and recommendations for vulnerability analysis","authors":"Nathan Bonham , Joseph Kasprzyk , Edith Zagona","doi":"10.1016/j.envsoft.2024.106269","DOIUrl":"10.1016/j.envsoft.2024.106269","url":null,"abstract":"<div><div>Vulnerability analysis is an emerging technique that discovers concise descriptions of the conditions that lead to decision-relevant outcomes (i.e., scenarios) by applying machine learning methods to a large ensemble of simulation model runs. This review organizes vulnerability analysis methods into a taxonomy and compares them in terms of interpretability, flexibility, and accuracy. Our review contextualizes interpretability in terms of five purposes for vulnerability analysis, such as adaptation systems and choosing between policies. We make recommendations for designing a vulnerability analysis that is interpretable for a specific purpose. Furthermore, a numerical experiment demonstrates how methods can be compared based on interpretability and accuracy. Several research opportunities are identified, including new developments in machine learning that could reduce computing requirements and improve interpretability. Throughout the review, a consistent example of reservoir operation policies in the Colorado River Basin illustrates the methods.</div></div>","PeriodicalId":310,"journal":{"name":"Environmental Modelling & Software","volume":"183 ","pages":"Article 106269"},"PeriodicalIF":4.8,"publicationDate":"2024-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142654734","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Integrated STL-DBSCAN algorithm for online hydrological and water quality monitoring data cleaning
Pub Date: 2024-11-10 | DOI: 10.1016/j.envsoft.2024.106262 | Environmental Modelling & Software, Vol. 183, Article 106262
Chenyu Song , Jingyuan Cui , Yafei Cui , Sheng Zhang , Chang Wu , Xiaoyan Qin , Qiaofeng Wu , Shanqing Chi , Mingqing Yang , Jia Liu , Ruihong Chen , Haiping Zhang
Online hydrological and water quality monitoring data have become increasingly important for water environment management tasks such as assessment and modeling. However, online monitoring data often contain erroneous or incomplete records, which affects their operational use. In this study, we developed an automated data cleaning algorithm grounded in Seasonal-Trend decomposition using Loess (STL) and Density-Based Spatial Clustering of Applications with Noise (DBSCAN). STL identifies and corrects the more obvious anomalies in the time series, and DBSCAN then refines the results, with the reverse nearest neighbor method employed to enhance clustering accuracy. To improve anomaly detection, a two-level residual judgment threshold was applied. The algorithm was successfully applied to three reservoirs in Shanghai, China, achieving a precision of 0.91 and a recall of 0.81 for dissolved oxygen and pH. The proposed algorithm can potentially be applied to clean environmental monitoring data with high accuracy and stability.
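A minimal sketch of the two-stage idea, assuming synthetic hourly data: STL residuals are first screened with a robust threshold, then DBSCAN flags remaining low-density points. The thresholds, window settings, and the reverse-nearest-neighbor refinement and anomaly-correction steps used in the paper are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

# Synthetic hourly dissolved-oxygen-like series with a daily cycle and injected spikes.
rng = np.random.default_rng(1)
idx = pd.date_range("2023-01-01", periods=24 * 60, freq="h")
y = 8 + 1.5 * np.sin(2 * np.pi * idx.hour / 24) + rng.normal(0, 0.1, len(idx))
y[[200, 700, 1100]] += [4, -5, 6]  # obvious anomalies
series = pd.Series(y, index=idx)

# Stage 1: STL decomposition; flag points whose residual exceeds a robust threshold.
res = STL(series, period=24, robust=True).fit()
resid = res.resid
sigma = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # MAD-based scale
stage1_outlier = np.abs(resid - np.median(resid)) > 3 * sigma

# Stage 2: DBSCAN on (value, residual) space; label -1 marks low-density points.
features = StandardScaler().fit_transform(np.column_stack([series, resid]))
labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(features)
stage2_outlier = labels == -1

flagged = stage1_outlier | stage2_outlier
print(f"flagged {flagged.sum()} of {len(series)} points")
```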
{"title":"Integrated STL-DBSCAN algorithm for online hydrological and water quality monitoring data cleaning","authors":"Chenyu Song , Jingyuan Cui , Yafei Cui , Sheng Zhang , Chang Wu , Xiaoyan Qin , Qiaofeng Wu , Shanqing Chi , Mingqing Yang , Jia Liu , Ruihong Chen , Haiping Zhang","doi":"10.1016/j.envsoft.2024.106262","DOIUrl":"10.1016/j.envsoft.2024.106262","url":null,"abstract":"<div><div>Online hydrological and water quality monitoring data has become increasingly crucial for water environment management such as assessment and modeling. However, online monitoring data often contains erroneous or incomplete datasets, consequently affecting its operational use. In the study, we developed an automated data cleaning algorithm grounded in Seasonal-Trend decomposition using Loess (STL) and Density-Based Spatial Clustering of Applications with Noise (DBSCAN). STL identifies and corrects more obvious anomalies in the time series, followed by DBSCAN for further refinement, in which the reverse nearest neighbor method was employed to enhance the clustering accuracy. To improve anomaly detection, a two-level residual judgment threshold was applied. The algorithm has been successfully applied to three reservoirs in Shanghai, China, achieving the precision rate of 0.91 and recall rate of 0.81 for dissolved oxygen and pH. The proposed algorithm can be potentially applied for cleaning of environment monitoring data with high accuracy and stability.</div></div>","PeriodicalId":310,"journal":{"name":"Environmental Modelling & Software","volume":"183 ","pages":"Article 106262"},"PeriodicalIF":4.8,"publicationDate":"2024-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142655282","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Enabling coastal analytics at planetary scale
Pub Date: 2024-11-08 | DOI: 10.1016/j.envsoft.2024.106257 | Environmental Modelling & Software, Vol. 183, Article 106257
Floris Reinier Calkoen , Arjen Pieter Luijendijk , Kilian Vos , Etiënne Kras , Fedor Baart
Coastal science has entered a new era of data-driven research, facilitated by satellite data and cloud computing. Despite its potential, the coastal community has yet to fully capitalize on these advancements due to a lack of tailored data, tools, and models. This paper demonstrates how cloud technology can advance coastal analytics at scale. We introduce GCTS, a novel foundational dataset comprising over 11 million coastal transects at 100-m resolution. Our experiments highlight the importance of cloud-optimized data formats, geospatial sorting, and metadata-driven data retrieval. By leveraging cloud technology, we achieve up to 700 times faster performance for tasks like coastal waterline mapping. A case study reveals that 33% of the world’s first kilometer of coast is below 5 m, with the entire analysis completed in a few hours. Our findings make a compelling case for the coastal community to start producing data, tools, and models suitable for scalable coastal analytics.
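The sketch below illustrates the general pattern the paper argues for, cloud-optimized storage plus metadata-driven retrieval, using a small synthetic Parquet store built locally so the example runs anywhere. The column names, partitioning key, and bounding-box scheme are assumptions for illustration and do not describe the actual GCTS layout.

```python
import numpy as np
import pandas as pd
import pyarrow as pa
import pyarrow.dataset as ds

# Synthetic stand-in for a cloud-optimized transect store: a partitioned Parquet
# dataset with per-row bounding-box columns so spatial predicates can be pushed
# down to files/row groups without reading everything (column names are assumed).
rng = np.random.default_rng(2)
lon, lat = rng.uniform(-180, 180, 100_000), rng.uniform(-60, 60, 100_000)
df = pd.DataFrame({
    "transect_id": np.arange(lon.size),
    "minx": lon, "maxx": lon, "miny": lat, "maxy": lat,   # points, so min == max
    "zone": (lon // 30).astype(int),                      # coarse spatial partition key
})
ds.write_dataset(pa.Table.from_pandas(df), "gcts_demo", format="parquet",
                 partitioning=["zone"], partitioning_flavor="hive")

# Metadata-driven retrieval: only partitions/row groups matching the filter are scanned.
aoi = (4.0, 52.0, 5.0, 53.0)  # lon/lat bounding box of interest
dataset = ds.dataset("gcts_demo", format="parquet", partitioning="hive")
expr = ((ds.field("maxx") >= aoi[0]) & (ds.field("minx") <= aoi[2]) &
        (ds.field("maxy") >= aoi[1]) & (ds.field("miny") <= aoi[3]))
subset = dataset.to_table(filter=expr).to_pandas()
print(len(subset), "of", len(df), "transects intersect the AOI")
```

Against an object store rather than a local directory, the same predicate-pushdown pattern is what keeps data transfer proportional to the area of interest rather than to the global dataset.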
{"title":"Enabling coastal analytics at planetary scale","authors":"Floris Reinier Calkoen , Arjen Pieter Luijendijk , Kilian Vos , Etiënne Kras , Fedor Baart","doi":"10.1016/j.envsoft.2024.106257","DOIUrl":"10.1016/j.envsoft.2024.106257","url":null,"abstract":"<div><div>Coastal science has entered a new era of data-driven research, facilitated by satellite data and cloud computing. Despite its potential, the coastal community has yet to fully capitalize on these advancements due to a lack of tailored data, tools, and models. This paper demonstrates how cloud technology can advance coastal analytics at scale. We introduce GCTS, a novel foundational dataset comprising over 11 million coastal transects at 100-m resolution. Our experiments highlight the importance of cloud-optimized data formats, geospatial sorting, and metadata-driven data retrieval. By leveraging cloud technology, we achieve up to 700 times faster performance for tasks like coastal waterline mapping. A case study reveals that 33% of the world’s first kilometer of coast is below 5 m, with the entire analysis completed in a few hours. Our findings make a compelling case for the coastal community to start producing data, tools, and models suitable for scalable coastal analytics.</div></div>","PeriodicalId":310,"journal":{"name":"Environmental Modelling & Software","volume":"183 ","pages":"Article 106257"},"PeriodicalIF":4.8,"publicationDate":"2024-11-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142672812","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Transformer-embedded 1D VGG convolutional neural network for regional landslides detection boosted by multichannel data inputs
Pub Date: 2024-11-08 | DOI: 10.1016/j.envsoft.2024.106261 | Environmental Modelling & Software, Vol. 183, Article 106261
Bangjie Fu , Yange Li , Chen Wang , Zheng Han , Nan Jiang , Wendu Xie , Changli Li , Haohui Ding , Weidong Wang , Guangqi Chen
Recent studies have demonstrated the effectiveness of Convolutional Neural Networks (CNNs) in landslide detection. With the rapid development of Remote Sensing and Geographic Information System technologies, an increasing amount of spectral and non-spectral information is available for CNN modeling. This offers a comprehensive perspective for landslide detection, but also presents challenges to CNNs, especially in efficiently learning long-range feature associations. We therefore propose a novel Transformer-improved VGG network (Trans-VGG). It takes spectral (RGB images) and non-spectral information (elevation, slope, and PCA components) as inputs and integrates both local and global features in modeling. The method is tested in two landslide cluster areas in Litang County, China. The results at site A show that the Trans-VGG model improves the F1-score by 4% to 21% compared with conventional machine learning and CNN models. Validation at site B further confirms the validity of the proposed method.
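A hedged PyTorch sketch of the general idea, not the authors' exact architecture: VGG-style 1D convolutions extract local features from a per-pixel multichannel vector, and a Transformer encoder then models longer-range associations among them. The channel count, layer sizes, and input composition are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TransVGG1D(nn.Module):
    """Illustrative sketch: VGG-style 1D convolution blocks over a per-pixel
    multichannel feature vector, followed by a Transformer encoder that models
    long-range associations among the extracted features."""

    def __init__(self, n_classes: int = 2, d_model: int = 64):
        super().__init__()

        def block(c_in, c_out):  # two conv layers + pooling, as in VGG
            return nn.Sequential(
                nn.Conv1d(c_in, c_out, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv1d(c_out, c_out, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool1d(2),
            )

        self.features = nn.Sequential(block(1, 32), block(32, d_model))
        enc_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):        # x: (batch, n_features) multichannel vector per pixel
        x = x.unsqueeze(1)       # -> (batch, 1, n_features) for 1D convolution
        x = self.features(x)     # -> (batch, d_model, n_features // 4)
        x = x.transpose(1, 2)    # -> (batch, seq_len, d_model) tokens for the encoder
        x = self.encoder(x).mean(dim=1)
        return self.head(x)      # landslide / non-landslide logits

# Hypothetical 32-element per-pixel feature vector (spectral bands, elevation,
# slope, PCA components, ...); a batch of 8 pixels.
logits = TransVGG1D()(torch.randn(8, 32))
print(logits.shape)  # torch.Size([8, 2])
```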
{"title":"Transformer-embedded 1D VGG convolutional neural network for regional landslides detection boosted by multichannel data inputs","authors":"Bangjie Fu , Yange Li , Chen Wang , Zheng Han , Nan Jiang , Wendu Xie , Changli Li , Haohui Ding , Weidong Wang , Guangqi Chen","doi":"10.1016/j.envsoft.2024.106261","DOIUrl":"10.1016/j.envsoft.2024.106261","url":null,"abstract":"<div><div>Up-to-date studies have proved the effectiveness of Convolutional Neural Networks (CNN) in landslide detection. With the rapid development of Remote Sensing and Geographic Information System technologies, an increasing amount of spectral and non-spectral information is available for CNN modeling. It offering a comprehensive perspective for landslide detection, but also presents challenges to CNNs, especially in efficiently learning long-range feature associations. Therefore, we proposed a novel Transformer-improved VGG network (Trans-VGG). It takes spectral (RGB images) and non-spectral information (elevation, slope, and PCA components) as data inputs and integrating both local and global feature in modeling. The method is tested in two landslide cluster areas in Litang County, China. The results in site a show that the Trans-VGG model demonstrates an improvement in F1-score, ranging from 4% to 21%, compared with the conventional machine learning and CNN models. The validation result in site b further proved the validity of our proposed method.</div></div>","PeriodicalId":310,"journal":{"name":"Environmental Modelling & Software","volume":"183 ","pages":"Article 106261"},"PeriodicalIF":4.8,"publicationDate":"2024-11-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142654737","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Data-driven fire modeling: Learning first arrival times and model parameters with neural networks
Pub Date: 2024-11-06 | DOI: 10.1016/j.envsoft.2024.106253 | Environmental Modelling & Software, Vol. 183, Article 106253
Xin Tong , Bryan Quaife
Data-driven techniques are increasingly being applied to complement physics-based models in fire science. However, the lack of sufficiently large datasets continues to hinder the application of certain machine learning techniques. In this paper, we use simulated data to investigate the ability of neural networks to parameterize dynamics in fire science. In particular, we investigate neural networks that map five key parameters in fire spread to the first arrival time, and the corresponding inverse problem. By using simulated data, we are able to characterize the error, the required dataset size, and the convergence properties of these neural networks. For the inverse problem, we quantify the network’s sensitivity in estimating each of the key parameters. The findings demonstrate the potential of machine learning in fire science, highlight the challenges associated with limited dataset sizes, and quantify the sensitivity of neural networks to estimate key parameters governing fire spread dynamics.
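A minimal sketch of the forward problem under stated assumptions: a small neural network is trained to map five hypothetical spread parameters to the first arrival time at a fixed location, using a toy analytic stand-in for the fire simulator. The inverse problem would follow the same pattern with inputs and targets swapped. The parameter names and the toy arrival-time formula are invented for illustration and are not from the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Toy stand-in for a fire-spread simulator: arrival time at a fixed sensor as a
# nonlinear function of five hypothetical spread parameters.
rng = np.random.default_rng(3)
n = 5000
params = rng.uniform(0, 1, size=(n, 5))          # wind, direction, fuel, moisture, slope
wind, direction, fuel, moisture, slope = params.T
arrival = 10.0 / (0.5 + 0.4 * wind * np.cos(np.pi * direction)
                  + 0.3 * fuel + 0.2 * slope) * (1.0 + 2.0 * moisture)

# Train a small fully connected network on simulated runs, evaluate on held-out runs.
X_train, X_test, y_train, y_test = train_test_split(params, arrival, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(X_train, y_train)
print("MAE on held-out runs:", mean_absolute_error(y_test, net.predict(X_test)))
```

Varying `n` in such a setup is one way to probe how the required dataset size drives accuracy, which is the kind of convergence question the paper quantifies with simulated data.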
{"title":"Data-driven fire modeling: Learning first arrival times and model parameters with neural networks","authors":"Xin Tong , Bryan Quaife","doi":"10.1016/j.envsoft.2024.106253","DOIUrl":"10.1016/j.envsoft.2024.106253","url":null,"abstract":"<div><div>Data-driven techniques are increasingly being applied to complement physics-based models in fire science. However, the lack of sufficiently large datasets continues to hinder the application of certain machine learning techniques. In this paper, we use simulated data to investigate the ability of neural networks to parameterize dynamics in fire science. In particular, we investigate neural networks that map five key parameters in fire spread to the first arrival time, and the corresponding inverse problem. By using simulated data, we are able to characterize the error, the required dataset size, and the convergence properties of these neural networks. For the inverse problem, we quantify the network’s sensitivity in estimating each of the key parameters. The findings demonstrate the potential of machine learning in fire science, highlight the challenges associated with limited dataset sizes, and quantify the sensitivity of neural networks to estimate key parameters governing fire spread dynamics.</div></div>","PeriodicalId":310,"journal":{"name":"Environmental Modelling & Software","volume":"183 ","pages":"Article 106253"},"PeriodicalIF":4.8,"publicationDate":"2024-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142654738","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Combining residual convolutional LSTM with attention mechanisms for spatiotemporal forest cover prediction
Pub Date: 2024-11-04 | DOI: 10.1016/j.envsoft.2024.106260 | Environmental Modelling & Software, Vol. 183, Article 106260
Bao Liu , Siqi Chen , Lei Gao
Understanding spatiotemporal variations in forest cover is crucial for effective forest resource management. However, existing models often lack accuracy in simultaneously capturing temporal continuity and spatial correlation. To address this challenge, we developed ResConvLSTM-Att, a novel hybrid model integrating residual neural networks, Convolutional Long Short-Term Memory (ConvLSTM) networks, and attention mechanisms. We evaluated ResConvLSTM-Att against four deep learning models: LSTM, combined convolutional neural network and LSTM (CNN-LSTM), ConvLSTM, and ResConvLSTM for spatiotemporal prediction of forest cover in Tasmania, Australia. ResConvLSTM-Att achieved outstanding prediction performance, with an average root mean square error (RMSE) of 6.9% in forest cover and an impressive average coefficient of determination of 0.965. Compared with LSTM, CNN-LSTM, ConvLSTM, and ResConvLSTM, ResConvLSTM-Att achieved RMSE reductions of 31.2%, 43.0%, 10.1%, and 6.5%, respectively. Additionally, we quantified the impacts of explanatory variables on forest cover dynamics. Our work demonstrated the effectiveness of ResConvLSTM-Att in spatiotemporal data modelling and prediction.
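A compact PyTorch sketch of the core building block, assuming synthetic inputs: a ConvLSTM cell whose gates are computed with convolutions, unrolled over a short sequence of cover maps, with a residual skip from the last input map. The attention mechanism and the full ResConvLSTM-Att architecture are omitted; the sizes and single-layer prediction head are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """Minimal ConvLSTM cell: LSTM gates computed with 2D convolutions so the
    hidden state keeps a spatial layout (illustrative sketch, not the paper's code)."""

    def __init__(self, in_ch: int, hid_ch: int, k: int = 3):
        super().__init__()
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        i, f, o, g = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o), torch.tanh(g)
        c = f * c + i * g
        h = o * torch.tanh(c)
        return h, c

# Unroll over a short sequence of forest-cover maps; a residual (skip) connection
# adds the last input map back to the prediction, as in the "Res" part of the model.
B, T, H, W, hid = 2, 6, 32, 32, 16
cell, head = ConvLSTMCell(1, hid), nn.Conv2d(hid, 1, 1)
x = torch.rand(B, T, 1, H, W)                      # synthetic cover fractions in [0, 1]
h = c = torch.zeros(B, hid, H, W)
for t in range(T):
    h, c = cell(x[:, t], (h, c))
pred = head(h) + x[:, -1]                          # residual connection
print(pred.shape)                                  # torch.Size([2, 1, 32, 32])
```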
{"title":"Combining residual convolutional LSTM with attention mechanisms for spatiotemporal forest cover prediction","authors":"Bao Liu , Siqi Chen , Lei Gao","doi":"10.1016/j.envsoft.2024.106260","DOIUrl":"10.1016/j.envsoft.2024.106260","url":null,"abstract":"<div><div>Understanding spatiotemporal variations in forest cover is crucial for effective forest resource management. However, existing models often lack accuracy in simultaneously capturing temporal continuity and spatial correlation. To address this challenge, we developed ResConvLSTM-Att, a novel hybrid model integrating residual neural networks, Convolutional Long Short-Term Memory (ConvLSTM) networks, and attention mechanisms. We evaluated ResConvLSTM-Att against four deep learning models: LSTM, combined convolutional neural network and LSTM (CNN-LSTM), ConvLSTM, and ResConvLSTM for spatiotemporal prediction of forest cover in Tasmania, Australia. ResConvLSTM-Att achieved outstanding prediction performance, with an average root mean square error (RMSE) of 6.9% coverage and an impressive average coefficient of determination of 0.965. Compared with LSTM, CNN-LSTM, ConvLSTM, and ResConvLSTM, ResConvLSTM-Att achieved RMSE reductions of 31.2%, 43.0%, 10.1%, and 6.5%, respectively. Additionally, we quantified the impacts of explanatory variables on forest cover dynamics. Our work demonstrated the effectiveness of ResConvLSTM-Att in spatiotemporal data modelling and prediction.</div></div>","PeriodicalId":310,"journal":{"name":"Environmental Modelling & Software","volume":"183 ","pages":"Article 106260"},"PeriodicalIF":4.8,"publicationDate":"2024-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142654735","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
EcoCommons Australia virtual laboratories with cloud computing: Meeting diverse user needs for ecological modeling and decision-making
Pub Date: 2024-11-03 | DOI: 10.1016/j.envsoft.2024.106255 | Environmental Modelling & Software, Vol. 183, Article 106255
Elisa Bayraktarov , Samantha Low-Choy , Abhimanyu Raj Singh , Linda J. Beaumont , Kristen J. Williams , John B. Baumgartner , Shawn W. Laffan , Daniela Vasco , Robert Cosgrove , Jenna Wraith , Jessica Fenker Antunes , Brendan Mackey
Biodiversity decline and climate change are among the most important environmental issues society faces. Information to address these issues has benefited from increasing big data, advances in cloud computing, and subsequent new tools for analytics. Accessing such tools is streamlined by virtual laboratories for ecological analysis, like the ‘Biodiversity and Climate Change Virtual Laboratory’ (BCCVL) and ‘ecocloud’. These platforms help reduce time and effort spent on developing programming skills, data acquisition and curation, plus model building. Recently this functionality was extended, producing EcoCommons Australia—a web-based ecological modeling platform for environmental problem-solving—with upgraded infrastructure and improved ensemble modeling, post-model analysis, workflow transparency and reproducibility. We outline our user-centered approach to systems design, from initial surveys of stakeholder needs to user involvement in testing, and collaboration with specialists. We illustrate EcoCommons and compare model evaluation statistics through four case studies, highlighting how the modular platform meets users' needs.
{"title":"EcoCommons Australia virtual laboratories with cloud computing: Meeting diverse user needs for ecological modeling and decision-making","authors":"Elisa Bayraktarov , Samantha Low-Choy , Abhimanyu Raj Singh , Linda J. Beaumont , Kristen J. Williams , John B. Baumgartner , Shawn W. Laffan , Daniela Vasco , Robert Cosgrove , Jenna Wraith , Jessica Fenker Antunes , Brendan Mackey","doi":"10.1016/j.envsoft.2024.106255","DOIUrl":"10.1016/j.envsoft.2024.106255","url":null,"abstract":"<div><div>Biodiversity decline and climate change are among the most important environmental issues society faces. Information to address these issues has benefited from increasing big data, advances in cloud computing, and subsequent new tools for analytics. Accessing such tools is streamlined by virtual laboratories for ecological analysis, like the ‘Biodiversity and Climate Change Virtual Laboratory’ (BCCVL) and ‘ecocloud’. These platforms help reduce time and effort spent on developing programming skills, data acquisition and curation, plus model building. Recently this functionality was extended, producing EcoCommons Australia—a web-based ecological modeling platform for environmental problem-solving—with upgraded infrastructure and improved ensemble modeling, post-model analysis, workflow transparency and reproducibility. We outline our user-centered approach to systems design, from initial surveys of stakeholder needs to user involvement in testing, and collaboration with specialists. We illustrate EcoCommons and compare model evaluation statistics through four case studies, highlighting how the modular platform meets users' needs.</div></div>","PeriodicalId":310,"journal":{"name":"Environmental Modelling & Software","volume":"183 ","pages":"Article 106255"},"PeriodicalIF":4.8,"publicationDate":"2024-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142655281","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An adaptable dead fuel moisture model for various fuel types and temporal scales tailored for wildfire danger assessment
Pub Date: 2024-11-02 | DOI: 10.1016/j.envsoft.2024.106254 | Environmental Modelling & Software, Vol. 183, Article 106254
Nicolò Perello , Andrea Trucchia , Mirko D’Andrea , Silvia Degli Esposti , Paolo Fiorucci , Andrea Gollini , Dario Negro
Estimating Dead Fuel Moisture Content (DFMC) is crucial in wildfire risk management, as it is a key component of forest fire danger rating systems and wildfire simulation models. DFMC fluctuates sub-daily and spatially, influenced by local weather and fuel characteristics, so models that resolve sub-daily fuel moisture conditions are needed to improve wildfire risk management. Many forest fire danger rating systems rely on daily fuel moisture models that overlook local fuel characteristics, with consequent impacts on wildfire management. The proposed semi-empirical parametric DFMC model addresses these issues by providing hourly dead fuel moisture dynamics, with dedicated parameters that account for local fuel characteristics. A calibration framework based on a Particle Swarm Optimization-type algorithm is also proposed and, in the present study, is tested using hourly measurements from 10-h fuel sticks. Implementing this model in forest fire danger rating systems would add detail to forest fire danger conditions, advancing wildfire risk management.
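A toy sketch of the two ingredients described above, under invented equations and parameter names: an hourly dead-fuel-moisture model that relaxes exponentially toward a weather-driven equilibrium, and a bare-bones particle swarm optimizer that calibrates its parameters against synthetic 10-h fuel-stick observations. Neither the model equations nor the PSO settings are the paper's.

```python
import numpy as np

def dfmc_model(rh, temp, params, m0=15.0, dt=1.0):
    """Toy hourly dead-fuel-moisture model (illustrative only): moisture relaxes
    exponentially toward an equilibrium that rises with relative humidity and
    falls with temperature. params = (a, b, tau)."""
    a, b, tau = params
    m, prev = np.empty_like(rh), m0
    for t in range(len(rh)):
        emc = a * rh[t] - b * temp[t]              # crude equilibrium moisture content
        prev = emc + (prev - emc) * np.exp(-dt / tau)
        m[t] = prev
    return m

def pso_calibrate(objective, bounds, n_particles=30, iters=200, seed=4):
    """Bare-bones global-best particle swarm optimizer."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, (n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Synthetic "10-h fuel stick" observations generated with known parameters, then recovered.
rng = np.random.default_rng(5)
hours = np.arange(24 * 14)
rh = 60 + 25 * np.sin(2 * np.pi * (hours % 24) / 24) + rng.normal(0, 3, hours.size)
temp = 18 + 8 * np.sin(2 * np.pi * ((hours % 24) - 6) / 24)
obs = dfmc_model(rh, temp, (0.22, 0.15, 10.0)) + rng.normal(0, 0.3, hours.size)

rmse = lambda p: np.sqrt(np.mean((dfmc_model(rh, temp, p) - obs) ** 2))
best, err = pso_calibrate(rmse, bounds=[(0.05, 0.5), (0.0, 0.5), (1.0, 48.0)])
print("calibrated (a, b, tau):", np.round(best, 3), "RMSE:", round(err, 3))
```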
{"title":"An adaptable dead fuel moisture model for various fuel types and temporal scales tailored for wildfire danger assessment","authors":"Nicolò Perello , Andrea Trucchia , Mirko D’Andrea , Silvia Degli Esposti , Paolo Fiorucci , Andrea Gollini , Dario Negro","doi":"10.1016/j.envsoft.2024.106254","DOIUrl":"10.1016/j.envsoft.2024.106254","url":null,"abstract":"<div><div>Estimating the Dead Fuel Moisture Content (DFMC) is crucial in wildfire risk management, representing a key component in forest fire danger rating systems and wildfire simulation models. DFMC fluctuates sub-daily and spatially, influenced by local weather and fuel characteristics. This necessitates models that provide sub-daily fuel moisture conditions for improving wildfire risk management. Many forest fire danger rating systems typically rely on daily fuel moisture models that overlook local fuel characteristics, with consequent impact on wildfire management. The semi-empirical parametric DFMC model proposed addresses these issues by providing hourly dead fuel moisture dynamics, with specific parameters to consider local fuel characteristics. A calibration framework is proposed by adopting Particle Swarm Optimization-type algorithm. In the present study, the calibration framework has been tested by using hourly 10-h fuel sticks measurements. Implementing this model in forest fire danger rating systems would enhance detail in forest fire danger conditions, advancing wildfire risk management.</div></div>","PeriodicalId":310,"journal":{"name":"Environmental Modelling & Software","volume":"183 ","pages":"Article 106254"},"PeriodicalIF":4.8,"publicationDate":"2024-11-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142654736","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}