PixelSWAT: A user-friendly ArcGIS tool for preparing inputs to run SWAT in a distributed discretization scheme
Pub Date: 2024-06-25 | DOI: 10.1016/j.acags.2024.100175
Nyigam Bole, Arnab Bandyopadhyay, Aditi Bhadra
This paper documents the development of PixelSWAT, a Graphical User Interface (GUI) Python toolbox developed to create gridded watershed and stream features for running the Soil and Water Assessment Tool (SWAT) in a distributed discretization scheme, thus allowing optimum utilization of gridded weather datasets. The tool also aims to automate the preparation of SWAT weather input files from Network Common Data Form (NetCDF) files for any SWAT user, with the option to interpolate the weather files for each grid. A case study was conducted in the Mago basin of Tawang, Arunachal Pradesh, using gridded weather datasets for hydrological simulation. Three SWAT models were prepared: a conventional SWAT model and two gridded-watershed PixelSWAT models at 500 m and 1000 m resolution. The statistical indices Nash-Sutcliffe Efficiency (NSE), Coefficient of Determination (R²) and Percent Bias (PBIAS) showed that the PixelSWAT projects performed marginally better than the conventional model and also incorporated the weather data more meaningfully.
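The three indices named above are standard hydrological goodness-of-fit measures. Below is a minimal sketch of how they could be computed from observed and simulated discharge series; it is illustrative only (the placeholder arrays and function names are assumptions, not part of the PixelSWAT toolbox).

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit, values below 0 are worse than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent Bias: positive values indicate model underestimation on average."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def r2(obs, sim):
    """Coefficient of determination as the squared Pearson correlation."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Placeholder monthly streamflow values (m3/s), for illustration only
observed  = [12.1, 18.4, 25.0, 30.2, 22.7, 15.3]
simulated = [11.5, 19.0, 23.8, 31.0, 21.9, 16.1]
print(nse(observed, simulated), r2(observed, simulated), pbias(observed, simulated))
```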
{"title":"PixelSWAT: A user-friendly ArcGIS tool for preparing inputs to run SWAT in a distributed discretization scheme","authors":"Nyigam Bole, Arnab Bandyopadhyay, Aditi Bhadra","doi":"10.1016/j.acags.2024.100175","DOIUrl":"https://doi.org/10.1016/j.acags.2024.100175","url":null,"abstract":"<div><p>This paper documents the development of PixelSWAT, a Graphical User interface (GUI) python toolbox developed with the motive of creating gridded watershed and stream features to run the Soil and Water Assessment Tool (SWAT) in a distributed discretization scheme thus allowing optimum utilization of gridded weather datasets. Additionally, the tool also aims to automate the preparation of SWAT weather input files from Network Common Data (NetCDF) files for any SWAT user along with the option to interpolate the weather files for each grid. A case study was conducted in the Mago basin of Tawang, Arunachal Pradesh, using gridded weather datasets for hydrological simulation. Three SWAT models were prepared – a conventional SWAT model; a 500 m and a 1000 m gridded watershed PixelSWAT models. Statistical indices Nash Sutcliffe (NSE), Coefficient of Determination (R<sup>2</sup>) and Percent Bias (PBIAS) showed that the PixelSWAT projects performed marginally better than the conventional model and also incorporated the weather data more meaningfully.</p></div>","PeriodicalId":33804,"journal":{"name":"Applied Computing and Geosciences","volume":"23 ","pages":"Article 100175"},"PeriodicalIF":2.6,"publicationDate":"2024-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2590197424000223/pdfft?md5=015d448ba5469b5c76c4bfae001af77e&pid=1-s2.0-S2590197424000223-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141485088","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Computational fluid dynamics in carbonate rock wormholes using magnetic resonance images as structural information
Pub Date: 2024-06-18 | DOI: 10.1016/j.acags.2024.100172
Gustavo Solcia, Bernd U. Foerster, Mariane B. Andreeta, Tito J. Bonagamba, Fernando F. Paiva
Computational fluid dynamics (CFD) is an essential tool with growing applications in many fields. In petrophysics, it is common to use computed tomography in such simulations, but in medicine, magnetic resonance imaging (MRI) is also used as a basis for structural information. Wormholes are high-permeability structures created by the acidification of carbonate reservoirs and can impact reservoir production. CFD combined with MRI can benefit the study of wormholes in petrophysics, but combining both techniques is still a challenge. The objective of this study is to develop a pipeline for performing CFD in wormholes with MRI data. Using three samples of carbonate rock acidified with 1.5% hydrochloric acid at 0.1, 1, and 10 ml/min, we acquired 300 μm resolution T2-weighted images and experimental pressure measurements at flow rates of 5 to 50 ml/min. We applied cropping, bias field correction, non-local means denoising, and segmentation in the image processing step. For the 3D reconstruction, we used marching cubes to generate the surface mesh, the Taubin filter for surface smoothing, and boundary modeling with Blender. Finally, for the CFD, we generated volumetric meshes with cfMesh and used the OpenFOAM simpleFoam solver to simulate an incompressible, stationary, and laminar flow. We analyzed the effect of surface smoothing by estimating edge displacements, and measured the simulated pressure at the same flow rates as the experiments. Surface smoothing had a negligible impact on the overall edge position. For most flow rates, the simulated and experimental pressure measurements matched. A possible reason for the discrepancies is that we did not consider the surrounding porous media in the simulations. In summary, our work produced satisfactory results, demonstrating CFD's feasibility for studying wormholes using MRI.
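A minimal sketch of the image-processing and surface-extraction steps described above, using scikit-image; the synthetic volume and filter parameters are assumptions for illustration, not the authors' acquisition settings, and the bias field correction, Taubin smoothing, and meshing steps are omitted.

```python
import numpy as np
from skimage import restoration, filters, measure

# Synthetic 3D volume standing in for a T2-weighted scan containing a bright wormhole channel
rng = np.random.default_rng(0)
volume = rng.normal(0.2, 0.05, size=(64, 64, 64))
volume[20:44, 20:44, :] += 0.6

# Non-local means denoising with an estimated noise level
sigma = float(np.mean(restoration.estimate_sigma(volume)))
denoised = restoration.denoise_nl_means(volume, h=1.15 * sigma, sigma=sigma,
                                         fast_mode=True, patch_size=3, patch_distance=3)

# Otsu threshold segmentation of the fluid-filled region
binary = denoised > filters.threshold_otsu(denoised)

# Marching cubes to extract the surface mesh (smoothing and volumetric meshing would follow,
# e.g. Taubin filtering, cfMesh, and OpenFOAM in the authors' pipeline)
verts, faces, normals, values = measure.marching_cubes(binary.astype(float), level=0.5)
print(verts.shape, faces.shape)
```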
{"title":"Computational fluid dynamics in carbonate rock wormholes using magnetic resonance images as structural information","authors":"Gustavo Solcia, Bernd U. Foerster, Mariane B. Andreeta, Tito J. Bonagamba, Fernando F. Paiva","doi":"10.1016/j.acags.2024.100172","DOIUrl":"https://doi.org/10.1016/j.acags.2024.100172","url":null,"abstract":"<div><p>Computational fluid dynamics (CFD) is an essential tool with growing applications in many fields. In petrophysics, it is common to use computed tomography in those simulations, but in medicine, magnetic resonance imaging (MRI) is also being used as a basis for structural information. Wormholes are high-permeability structures created by the acidification of carbonate reservoirs and can impact reservoir production. CFD combined with MRI can benefit the study of wormholes in petrophysics, but combining both techniques is still a challenge. The objective of this study is to develop a pipeline for performing CFD in wormholes with MRI data. Using three samples of carbonate rocks acidified with 1.5% hydrochloric acid at 0.1, 1, and 10 ml/min, we acquired <span><math><mrow><mn>300</mn><mspace></mspace><mi>μ</mi><mi>m</mi></mrow></math></span> resolution T2-weighted images and experimental measurements of pressure data within flow rates of 5 to 50 ml/min. We applied cropping, bias field correction, non-local means denoising, and segmentation in the image processing step. For the 3D reconstruction, we used marching cubes to generate the surface mesh, the Taubin filter for surface smoothing, and boundary modeling with Blender. Finally, for the CFD, we generated volumetric meshes with cfMesh and used the OpenFOAM simpleFoam solver to simulate an incompressible, stationary, and laminar flow. We analyzed the effect of surface smoothing, estimating edge displacements, and measured the simulation pressure at the same flow rates as the experiments. Surface smoothing had a negligible impact on the overall edge position. For most flow rates, the simulation and experimental pressure measurements matched. A possible reason for the discrepancies is that we did not consider the surrounding porous media in the simulations. In summary, our work had satisfactory results, demonstrating CFD’s feasibility in studying wormholes using MRI.</p></div>","PeriodicalId":33804,"journal":{"name":"Applied Computing and Geosciences","volume":"23 ","pages":"Article 100172"},"PeriodicalIF":2.6,"publicationDate":"2024-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2590197424000193/pdfft?md5=059cc5ef82a79ca57be4db288a9600db&pid=1-s2.0-S2590197424000193-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141444287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Enhancing estuary salinity prediction: A Machine Learning and Deep Learning based approach
Leonardo Saccotelli, Giorgia Verri, Alessandro De Lorenzis, Carla Cherubini, Rocco Caccioppoli, Giovanni Coppini, Rosalia Maglietta
Pub Date: 2024-06-18 | DOI: 10.1016/j.acags.2024.100173
As critical transitional ecosystems, estuaries face the increasingly urgent threat of salt wedge intrusion, which impacts their ecological balance as well as human-dependent activities. Accurately predicting estuary salinity is essential for water resource management, ecosystem preservation, and ensuring sustainable development along coastlines. In this study, we investigated the application of different machine learning and deep learning models to predict salinity levels within estuarine environments. Leveraging techniques including Random Forest, Least-Squares Boosting, Artificial Neural Networks and Long Short-Term Memory networks, we aimed to enhance predictive accuracy and better understand the complex interplay of factors influencing estuarine salinity dynamics. The Po River estuary (Po di Goro), one of the main hotspots of salt wedge intrusion, was selected as the study area. Comparative analyses of the machine learning models against the state-of-the-art physics-based Estuary Box Model (EBM) and Hybrid-EBM models were conducted to assess model performance. The results highlighted an improvement in machine learning performance, with a reduction in RMSE (from 4.22 psu for the physics-based EBM to 2.80 psu for LSBoost-Season) and an increase in the R² score (from 0.67 for the physics-based EBM to 0.85 for LSBoost-Season), computed on the test set. We also explored the impact of different variables and their contributions to the predictive capabilities of the models. Overall, this study demonstrates the feasibility and effectiveness of ML-based approaches for estimating salinity levels due to salt wedge intrusion within estuaries. The insights obtained could significantly support smart management strategies, not only in the Po River estuary but also in other locations.
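A minimal illustration of the kind of regression workflow and test-set scoring described above, using scikit-learn; gradient boosting is used here as a rough stand-in for LSBoost, and the synthetic predictors and target are assumptions, not the study's data.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic predictors (e.g. river discharge, sea level, tidal range) and a salinity target in psu
rng = np.random.default_rng(42)
X = rng.random((500, 3))
y = 30 * (1 - X[:, 0]) + 5 * X[:, 1] + rng.normal(0, 1, 500)   # placeholder relation

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05).fit(X_tr, y_tr)

pred = model.predict(X_te)
rmse = np.sqrt(mean_squared_error(y_te, pred))
print(f"RMSE = {rmse:.2f} psu, R2 = {r2_score(y_te, pred):.2f}")
```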
{"title":"Enhancing estuary salinity prediction: A Machine Learning and Deep Learning based approach","authors":"Leonardo Saccotelli , Giorgia Verri , Alessandro De Lorenzis , Carla Cherubini , Rocco Caccioppoli , Giovanni Coppini , Rosalia Maglietta","doi":"10.1016/j.acags.2024.100173","DOIUrl":"https://doi.org/10.1016/j.acags.2024.100173","url":null,"abstract":"<div><p>As critical transitional ecosystems, estuaries are facing the increasingly urgent threat of salt wedge intrusion, which impacts their ecological balance as well as human-dependent activities. Accurately predicting estuary salinity is essential for water resource management, ecosystem preservation, and for ensuring sustainable development along coastlines. In this study, we investigated the application of different machine learning and deep learning models to predict salinity levels within estuarine environments. Leveraging different techniques, including Random Forest, Least-Squares Boosting, Artificial Neural Network and Long Short-Term Memory networks, the aim was to enhance the predictive accuracy in order to better understand the complex interplay of factors influencing estuarine salinity dynamics. The Po River estuary (Po di Goro), which is one of the main hotspots of salt wedge intrusion, was selected as the study area. Comparative analyses of machine learning models with the state-of-the-art physics-based Estuary box model (EBM) and Hybrid-EBM models were conducted to assess model performances. The results highlighted an improvement in the machine learning performance, with a reduction in the RMSE (from 4.22 psu obtained by physics-based EBM to 2.80 psu obtained by LSBoost-Season) and an increase in the <span><math><msup><mrow><mi>R</mi></mrow><mrow><mn>2</mn></mrow></msup></math></span> score (from 0.67 obtained by physics-based EBM to 0.85 by LSBoost-Season), computed on the test set. We also explored the impact of different variables and their contributions to the predictive capabilities of the models. Overall, this study demonstrates the feasibility and effectiveness of ML-based approaches for estimating salinity levels due to salt wedge intrusion within estuaries. The insights obtained from this study could significantly support smart management strategies, not only in the Po River estuary, but also in other location.</p></div>","PeriodicalId":33804,"journal":{"name":"Applied Computing and Geosciences","volume":"23 ","pages":"Article 100173"},"PeriodicalIF":2.6,"publicationDate":"2024-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S259019742400020X/pdfft?md5=e1745be59526ad5e06d93bdaa293674d&pid=1-s2.0-S259019742400020X-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141485087","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Knowledge-based query system for the critical minerals
Pub Date: 2024-06-01 | DOI: 10.1016/j.acags.2024.100167
Armita Davarpanah, Hassan A. Babaie, W. Crawford Elliott
Critical minerals are increasingly used in advanced, modern technologies. Exploration for these minerals requires efficient mechanisms to search the latest geological knowledge about the petrogenesis and spatial distribution of these essential resources. Although current text-based deposit classification schemes help geoscientists understand how and where these critical minerals form, they cannot easily be queried by software without extensive natural language processing and knowledge modeling. Ontologies can explicitly specify the knowledge scattered in the texts and tables of these schemes and in the Critical Minerals Mapping Initiative (CMMI) database by way of logical structures whose results can be processed and queried automatically. They can also draw new knowledge by inference from the knowledge that is explicitly specified in them. These qualities make ontologies well suited to digital knowledge storage, search, and extraction. The Critical Minerals Ontology (CMO) described herein reuses the logical class and property structures of the top-level Basic Formal Ontology (BFO) and the mid-level Common Core Ontologies (CCO) and Relation Ontology (RO). The CMO formally models knowledge about critical mineral systems using the latest deposit classification scheme and the CMMI database schema. The ontology specifies the geochemical and geological processes that operate in the various geotectonic environments of mineral systems to form critical minerals in different deposit types. It models the properties of both the host minerals that contain the rare-earth elements and those that bear other types of elements. The CMO also represents the uses of specific critical minerals in the manufacturing of industrial products, their alternate substitutes, and the countries that produce, import, and export them. A query system, implemented in the Python programming language, accesses the knowledge modeled in the CMO and allows users, through interactive web pages, to query the ontology and extract different types of information from it. The ontology and the query system are useful for research in ore mineralogy and critical mineral prospecting. The information modeled by the ontology and served by the query system allows users to classify their ore specimen data into specific deposit types.
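A minimal sketch of how a Python query layer can interrogate an OWL ontology with SPARQL, using rdflib; the filename "cmo.owl" and the query pattern are assumptions for illustration, not the authors' actual query system.

```python
from rdflib import Graph

# Load the ontology file (filename is a placeholder)
g = Graph()
g.parse("cmo.owl", format="xml")

# Example SPARQL query: list all classes and their labels defined in the ontology
query = """
    SELECT ?cls ?label WHERE {
        ?cls a owl:Class ;
             rdfs:label ?label .
    }
"""
namespaces = {"owl": "http://www.w3.org/2002/07/owl#",
              "rdfs": "http://www.w3.org/2000/01/rdf-schema#"}
for cls, label in g.query(query, initNs=namespaces):
    print(cls, label)
```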
{"title":"Knowledge-based query system for the critical minerals","authors":"Armita Davarpanah , Hassan A. Babaie , W. Crawford Elliott","doi":"10.1016/j.acags.2024.100167","DOIUrl":"10.1016/j.acags.2024.100167","url":null,"abstract":"<div><p>Critical minerals are increasingly used in advanced, modern technologies. Exploration for these minerals require efficient mechanisms to search for the latest geological knowledge about the petrogenesis and spatial distribution of these essential resources. Although the current text-based deposit classification schemes help geoscientists to understand how and where these critical minerals form, they cannot easily be queried by software without extensive natural language processing and knowledge modeling. Ontologies can explicitly specify the knowledge scattered in the texts and tables of these schemes and the Critical Minerals Mapping Initiative (CMMI) database by way of logical structures whose results can automatically be processed and queried. They can also draw new knowledge by inference from the ones that are explicitly specified in them. These qualities make ontologies a perfect choice for digital knowledge storage, search, and extraction. The Critical Minerals Ontology (CMO) is described herein by reusing the logical class and property structures of the top-level Basic Formal Ontology (BFO) and mid-level Common Core Ontologies (CCO) and Relation Ontology (RO). The CMO formally models the knowledge about the critical mineral systems using the latest deposit classification scheme and the CMMI database schema. The ontology specifies the geochemical and geological processes that operate in various geotectonic environments of mineral systems to form the critical minerals in different deposit types. It models the properties of both the host minerals that contain the rare-earth elements and those that bear other types of elements. The CMO also represents uses of specific critical minerals in the manufacturing of industrial products, their alternate substitutes, and countries that produce, import, and export them. A query system, applying the Python programming language, accesses the knowledge modeled in the CMO and allows users through interactive web pages to query the ontology and extract different types of information from it. The ontology and the query system are useful for research in ore mineralogy and critical mineral prospecting. The information modeled by the ontology and served by the query system allows users to classify their ore specimen data into specific deposit types.</p></div>","PeriodicalId":33804,"journal":{"name":"Applied Computing and Geosciences","volume":"22 ","pages":"Article 100167"},"PeriodicalIF":3.4,"publicationDate":"2024-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2590197424000144/pdfft?md5=2d9e8afd7172f753322344e24e6d8d5b&pid=1-s2.0-S2590197424000144-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141144553","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Optimizing bathymetric position index (BPI) calculation: An analysis of parameters and recommendations for the selection of their optimal values
Pub Date: 2024-05-31 | DOI: 10.1016/j.acags.2024.100168
A. Mena, L.M. Fernández-Salas
This paper addresses a critical gap in the existing literature: the absence of a standardized methodology for parameter selection when computing Bathymetric Position Index (BPI) values. The BPI measures where a georeferenced location, with a defined depth, lies relative to the neighbouring seascape, and it plays a significant role in characterizing benthic terrain for modelling and classification. Arguably, the two most important parameters when calculating the BPI are the size and the shape of the neighbourhood of analysis. With regard to the radius parameter, which defines the size of the neighbourhood, the optimal radius value for calculating the BPI must be chosen carefully, considering both the size of the target morphology and the scale factor, which equals the radius in map units multiplied by the cell size. It is recommended that the optimal radius value closely match the size of the target morphology. Tests performed using an annular neighbourhood shape revealed that the outer radius is the most influential factor in the BPI calculation. Further experiments and comparisons between circular and annular shapes indicated that the choice of shape has no significant impact on the results. The study found no substantial correlation between BPI values and the other examined terrain variables, such as depth, slope, and curvature. This lack of correlation may be attributed to the BPI values accounting for the specific neighbourhood size, while the other variables were computed with the default window size, a considerably smaller scale than those used in most BPI calculations. In conclusion, this research highlights the importance of parameter selection in BPI calculations and provides valuable insights into the optimal radius choice and the negligible impact of neighbourhood shape. The findings also shed light on the unique nature of BPI values and their relationship with other geospatial variables.
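A minimal sketch of a BPI computation with an annular neighbourhood, following the common definition BPI = cell depth minus the focal mean of the surrounding annulus; the radii, cell size, and random bathymetry grid are assumptions for illustration only.

```python
import numpy as np
from scipy.ndimage import convolve

def annulus_kernel(inner_radius, outer_radius):
    """Normalized annular footprint; radii are given in cells."""
    r = outer_radius
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    dist = np.hypot(y, x)
    k = ((dist >= inner_radius) & (dist <= outer_radius)).astype(float)
    return k / k.sum()

def bpi(depth, inner_radius, outer_radius):
    """BPI = cell depth minus the focal mean depth of the annular neighbourhood."""
    focal_mean = convolve(depth, annulus_kernel(inner_radius, outer_radius), mode="nearest")
    return depth - focal_mean

# Placeholder bathymetry grid (negative values = depth below datum)
depth = -50 + 5 * np.random.default_rng(1).random((200, 200))

# Scale factor = outer radius (cells) x cell size, e.g. 25 cells x 10 m = 250 m
result = bpi(depth, inner_radius=5, outer_radius=25)
print(result.min(), result.max())
```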
{"title":"Optimizing bathymetric position index (BPI) calculation: An analysis of parameters and recommendations for the selection of their optimal values","authors":"A. Mena, L.M. Fernández-Salas","doi":"10.1016/j.acags.2024.100168","DOIUrl":"https://doi.org/10.1016/j.acags.2024.100168","url":null,"abstract":"<div><p>The present research paper addresses a critical gap in existing literature concerning the absence of a standardized methodology for parameter selection in the computation of the Bathymetric Position Index (BPI) values. The BPI is a measure of where a georeferenced location, with a defined depth, is relative to the neighbouring seascape, and it plays a significant role in characterizing benthic terrain for modelling and classification. Arguably, the two most important parameters when calculating the BPI are the size and the shape of the neighbourhood of analysis. With regards to the radius parameter, which defines the size of the neighbourhood, the optimal radius value for calculating the BPI must be carefully chosen, considering both the size of the target morphology and the scale factor, which is equal to the radius in map units multiplied by the cell size. It is recommended that the optimal radius value should closely match the size of the target morphology. Tests were performed using an annular neighbourhood shape and they have revealed that the outer radius is the most influential factor in the BPI calculation. Further experimentations and comparisons between circular and annular shapes have indicated that the use of different shapes has no significant impact on the results. The study has found no substantial correlation between the BPI values and other examined terrain variables, such as depth, slope, and curvature. This lack of correlation may be attributed to the BPI values accounting for the specific neighbourhood size, while for the studied variables the default window size was used, which is a considerably smaller scale than the ones used in most BPI calculations. In conclusion, this research highlights the importance of parameter selection in BPI calculations and provides valuable insights into the optimal radius choice and the negligible impact of neighbourhood shape. The findings also shed light on the unique nature of BPI values and their relationship with other geospatial variables.</p></div>","PeriodicalId":33804,"journal":{"name":"Applied Computing and Geosciences","volume":"23 ","pages":"Article 100168"},"PeriodicalIF":3.4,"publicationDate":"2024-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2590197424000156/pdfft?md5=0286e8b9c27e2b3079a179596aba29da&pid=1-s2.0-S2590197424000156-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141286448","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
BioReactPy: An open-source software for simulation of microbial-mediated reactive processes in porous media
Pub Date: 2024-04-21 | DOI: 10.1016/j.acags.2024.100166
M. Starnoni, M.A. Dawi, X. Sanchez-Vila
This paper presents BioReactPy, a new open-source software package for the simulation of microbial-mediated coupled processes of flow and reactive transport in porous media. The software is based on the micro-continuum approach, and geochemistry is handled in a fully coupled manner, with biomass-nutrient growth treated with the Monod equation in a single integrated framework, without dependencies on third-party packages. The distinguishing features of the software, its design principles, and the formulation of multiphysics problems and discretizations are discussed. Validation of the Python implementation against several established benchmarks for flow, reactive transport, and biomass growth is presented. The flexibility of the framework is then illustrated by simulations of highly non-linearly coupled flow and microbial reactive transport at conditions relevant to carbon mineralization for CO₂ storage. All results can be reproduced with openly available simulation scripts.
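The Monod formulation couples biomass growth to nutrient availability via the rate μ = μ_max · S / (K_s + S). A minimal, stand-alone sketch of such a biomass-nutrient system is shown below; the parameter values and the ODE-only setting are assumptions for illustration, not BioReactPy defaults (which additionally couple flow and transport).

```python
import numpy as np
from scipy.integrate import solve_ivp

mu_max, K_s, Y = 0.3, 0.5, 0.4   # placeholder: max growth rate (1/h), half-saturation (g/L), yield

def monod_growth(t, state):
    """Biomass X grows at the Monod rate; substrate S is consumed in proportion to growth."""
    X, S = state
    mu = mu_max * S / (K_s + S)
    return [mu * X, -(mu / Y) * X]

sol = solve_ivp(monod_growth, (0.0, 48.0), [0.05, 10.0], t_eval=np.linspace(0, 48, 25))
print(sol.y[0][-1], sol.y[1][-1])   # final biomass and residual substrate
```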
{"title":"BioReactPy: An open-source software for simulation of microbial-mediated reactive processes in porous media","authors":"M. Starnoni, M.A. Dawi, X. Sanchez-Vila","doi":"10.1016/j.acags.2024.100166","DOIUrl":"https://doi.org/10.1016/j.acags.2024.100166","url":null,"abstract":"<div><p>This paper provides a new open-source software, named BioReactPy, for simulation of microbial-mediated coupled processes of flow and reactive transport in porous media. The software is based on the micro-continuum approach, and geochemistry is handled in a fully coupled manner with biomass-nutrient growth treated with Monod equation in a single integrated framework, without dependencies on third party packages. The distinguishing features of the software, its design principles, and formulation of multiphysics problems and discretizations are discussed. Validation of the <em>Python</em> implementation using several established benchmarks for flow, reactive transport, and biomass growth is presented. The flexibility of the framework is then illustrated by simulations of highly non-linearly coupled flow and microbial reactive transport at conditions relevant to carbon mineralization for CO<span><math><msub><mrow></mrow><mrow><mn>2</mn></mrow></msub></math></span> storage. All results can be reproduced by openly available simulation scripts.</p></div>","PeriodicalId":33804,"journal":{"name":"Applied Computing and Geosciences","volume":"22 ","pages":"Article 100166"},"PeriodicalIF":3.4,"publicationDate":"2024-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2590197424000132/pdfft?md5=0356eeaf365220fb8e4e4b0812a35643&pid=1-s2.0-S2590197424000132-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140644289","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Single image multi-scale enhancement for rock Micro-CT super-resolution using residual U-Net
Pub Date: 2024-04-17 | DOI: 10.1016/j.acags.2024.100165
Liqun Shan, Chengqian Liu, Yanchang Liu, Yazhou Tu, Sai Venkatesh Chilukoti, Xiali Hei
Micro-CT, also known as X-ray micro-computed tomography, has emerged as the primary instrument for studying pore-scale properties of geological materials. Several studies have used deep learning to achieve super-resolution reconstruction in order to balance the trade-off between the resolution of CT images and the field of view. Nevertheless, most existing methods only work with single-scale CT scans, ignoring the possibility of using multi-scale image features for image reconstruction. In this study, we propose a super-resolution approach via multi-scale fusion using a residual U-Net for rock micro-CT image reconstruction (MS-ResUnet). The residual U-Net provides an encoder-decoder structure. In each encoder layer, several residual sequential blocks and improved residual blocks are used. The decoder is composed of convolutional ReLU residual blocks and residual chained pooling blocks. During encoding and decoding, information transferred between neighboring multi-resolution images is fused, resulting in richer rock characteristic information. Qualitative and quantitative comparisons on sandstone, carbonate, and coal CT images demonstrate that our proposed algorithm surpasses existing approaches. Our model accurately reconstructed the intricate details of pores in carbonate and sandstone, as well as clearly visible coal cracks.
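A minimal PyTorch sketch of the kind of residual convolutional block an encoder layer might use; the channel counts and layer layout are illustrative assumptions, not the exact MS-ResUnet architecture.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with an identity skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)   # skip connection eases training of deep encoders

# One low-resolution grayscale micro-CT patch, batch of 1
patch = torch.randn(1, 1, 64, 64)
encoder_stage = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), ResidualBlock(32))
print(encoder_stage(patch).shape)   # torch.Size([1, 32, 64, 64])
```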
{"title":"Single image multi-scale enhancement for rock Micro-CT super-resolution using residual U-Net","authors":"Liqun Shan , Chengqian Liu , Yanchang Liu , Yazhou Tu , Sai Venkatesh Chilukoti , Xiali Hei","doi":"10.1016/j.acags.2024.100165","DOIUrl":"https://doi.org/10.1016/j.acags.2024.100165","url":null,"abstract":"<div><p>Micro-CT, also known as X-ray micro-computed tomography, has emerged as the primary instrument for pore-scale properties study in geological materials. Several studies have used deep learning to achieve super-resolution reconstruction in order to balance the trade-off between resolution of CT images and field of view. Nevertheless, most existing methods only work with single-scale CT scans, ignoring the possibility of using multi-scale image features for image reconstruction. In this study, we proposed a super-resolution approach via multi-scale fusion using residual U-Net for rock micro-CT image reconstruction (MS-ResUnet). The residual U-Net provides an encoder-decoder structure. In each encoder layer, several residual sequential blocks and improved residual blocks are used. The decoder is composed of convolutional ReLU residual blocks and residual chained pooling blocks. During the encoding-decoding method, information transfers between neighboring multi-resolution images are fused, resulting in richer rock characteristic information. Qualitative and quantitative comparisons of sandstone, carbonate, and coal CT images demonstrate that our proposed algorithm surpasses existing approaches. Our model accurately reconstructed the intricate details of pores in carbonate and sandstone, as well as clearly visible coal cracks.</p></div>","PeriodicalId":33804,"journal":{"name":"Applied Computing and Geosciences","volume":"22 ","pages":"Article 100165"},"PeriodicalIF":3.4,"publicationDate":"2024-04-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2590197424000120/pdfft?md5=a5d1fae25e7acce0a16ad1a4c88f7058&pid=1-s2.0-S2590197424000120-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140644288","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Machine learning technique in the north Zagros earthquake prediction
Pub Date: 2024-04-12 | DOI: 10.1016/j.acags.2024.100163
Salma Ommi, Mohammad Hashemi
Studying changes in seismicity and the potential occurrence of large earthquakes in a seismic zone is extremely important not only for seismological research but also for crisis-management decisions. Since machine learning techniques have proven highly capable of analyzing information and discovering relations among parameters, several of these techniques were tested here for earthquake prediction. The north Zagros seismic catalogue was selected for analysis, covering a region that is seismically active and contains large cities. Nine seismic parameters were used to study the possibility of predicting large earthquakes one month ahead using three Machine Learning (ML) techniques: Artificial Neural Network (ANN), Random Forest, and Support Vector Machine (SVM). The accuracy of the prediction models was evaluated using four statistical measures (recall, accuracy, precision, and F1-score). The results showed that the ANN method is more accurate than the other methods. For all three investigated methodologies, forecasts were more accurate for larger-magnitude earthquakes, for which the seismic catalogue is more complete. These results suggest the possibility of successful prediction over short periods, which is promising for better crisis-management performance.
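A minimal sketch of evaluating a classifier with the four measures named above, using scikit-learn; the synthetic features stand in for the nine seismic parameters, and the MLP settings are assumptions, not the authors' model configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Synthetic "seismic parameters" and a binary label: large event in the next month or not
rng = np.random.default_rng(7)
X = rng.random((600, 9))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.2, 600) > 0.9).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)

print("accuracy ", accuracy_score(y_te, pred))
print("precision", precision_score(y_te, pred))
print("recall   ", recall_score(y_te, pred))
print("F1-score ", f1_score(y_te, pred))
```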
{"title":"Machine learning technique in the north zagros earthquake prediction","authors":"Salma Ommi , Mohammad Hashemi","doi":"10.1016/j.acags.2024.100163","DOIUrl":"https://doi.org/10.1016/j.acags.2024.100163","url":null,"abstract":"<div><p>Studying the changes in seismicity, and the potential of the occurrences of large earthquakes in a seismic zone is not only extremely important from the aspect of seismological research, but it is additionally significant in the decisions of crisis management. Since, nowadays Machine learning techniques have proven the high ability for analyzing information, and discovering the relations among the parameters, in this research were tested some of these techniques for the earthquake prediction. For analysis, the north Zagros seismic catalogue was selected. A region that is an active seismic zone, and large cities are located there. Moreover, nine seismic parameters were used to study the possibility of large earthquake prediction for 1 month using three different Machine Learning (ML) techniques (Artificial Neural Network (ANN), Random Forest, and Support Vector Machine (SVM)). The accuracy of prediction models was evaluated using four different statistical measures (recall, accuracy, precision, and F1-score). The results showed that the (ANN) method is more accurate than other methods. Based on three investigated methodologies, greater accuracy results have been produced to forecast the earthquakes with bigger scale earthquakes about the completeness of the seismic catalogue in large magnitude. These achievements promise the possibility of successful prediction in a short period, which is hopeful for better crisis management performance.</p></div>","PeriodicalId":33804,"journal":{"name":"Applied Computing and Geosciences","volume":"22 ","pages":"Article 100163"},"PeriodicalIF":3.4,"publicationDate":"2024-04-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2590197424000107/pdfft?md5=bd218566b9d38745ae009d44255d10cd&pid=1-s2.0-S2590197424000107-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140619101","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A hybrid knowledge graph for efficient exploration of lithostratigraphic information in open text data
Pub Date: 2024-04-11 | DOI: 10.1016/j.acags.2024.100164
Wenjia Li, Xiaogang Ma, Xinqing Wang, Liang Wu, Sanaz Salati, Zhong Xie
Rocks formed during different geologic times record the diverse evolution of the geosphere and biosphere. In the past decades, substantial geoscience data have been made open access, providing invaluable resources for studying the stratigraphy of different regions and at different scales. However, many open datasets have information recorded in natural language with heterogeneous terminologies, and efficient approaches to analyze them are lacking. In this research, we constructed a hybrid Stratigraphic Knowledge Graph (StraKG) to help address this challenge. StraKG has two layers, a simple schema layer and a rich instance layer. For the schemas, we used a short but functional list of classes and relationships, and then incorporated community-recognized terminologies from geological dictionaries. For the instances, we used natural language processing techniques to analyze open text data and obtained massive numbers of records, such as rock names and spatial locations. The nodes in the two layers were associated to establish a consistent structure of stratigraphic knowledge. To verify the functionality of StraKG, we applied it to the Baidu encyclopedia, the largest online Chinese encyclopedia. Three experiments were implemented, on the topics of stratigraphic correlation, the spatial distribution of ophiolite in China, and the spatio-temporal distribution of open lithostratigraphic data. The results show that StraKG can provide a strong knowledge reference for stratigraphic studies. Used together with data exploration and data mining methods, StraKG illustrates a new approach to analyzing open, large-scale text data in geoscience.
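A minimal sketch of the two-layer idea (schema nodes plus text-derived instance nodes) using networkx; the class names, relation labels, and example records are hypothetical illustrations, not StraKG's actual schema.

```python
import networkx as nx

G = nx.MultiDiGraph()

# Schema layer: a short list of classes
for cls in ["Stratigraphic Unit", "Rock", "Location", "Geologic Age"]:
    G.add_node(cls, layer="schema")

# Instance layer: records extracted from open text (hypothetical examples)
G.add_node("Ophiolite", layer="instance")
G.add_node("China", layer="instance")
G.add_edge("Ophiolite", "Rock", relation="instance_of")
G.add_edge("China", "Location", relation="instance_of")
G.add_edge("Ophiolite", "China", relation="located_in")

# Simple query: all instances linked to the schema node "Rock"
rocks = [u for u, _, d in G.in_edges("Rock", data=True) if d.get("relation") == "instance_of"]
print(rocks)
```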
{"title":"A hybrid knowledge graph for efficient exploration of lithostratigraphic information in open text data","authors":"Wenjia Li , Xiaogang Ma , Xinqing Wang , Liang Wu , Sanaz Salati , Zhong Xie","doi":"10.1016/j.acags.2024.100164","DOIUrl":"https://doi.org/10.1016/j.acags.2024.100164","url":null,"abstract":"<div><p>Rocks formed during different geologic time record the diverse evolution of the geosphere and biosphere. In the past decades, substantial geoscience data have been made open access, providing invaluable resources for studying the stratigraphy in different regions and at different scales. However, many open datasets have information recorded in natural language with heterogeneous terminologies, short of efficient approaches to analyze them. In this research, we constructed a hybrid Stratigraphic Knowledge Graph (StraKG) to help address this challenge. StraKG has two layers, a simple schema layer and a rich instance layer. For the schemas, we used a short but functional list of classes and relationships, and then incorporated community-recognized terminologies from geological dictionaries. For the instances, we used natural language processing techniques to analyze open text data and obtained massive records, such as rocks and spatial locations. The nodes in the two layers were associated to establish a consistent structure of stratigraphic knowledge. To verify the functionality of StraKG, we applied it to the Baidu encyclopedia, the largest online Chinese encyclopedia. Three experiments were implemented on the topics of stratigraphic correlation, spatial distribution of ophiolite in China, and spatio-temporal distribution of open lithostratigraphic data. The results show that StraKG can provide strong knowledge reference for stratigraphic studies. Used together with data exploration and data mining methods, StraKG illustrates a new approach to analyze the open and big text data in geoscience.</p></div>","PeriodicalId":33804,"journal":{"name":"Applied Computing and Geosciences","volume":"22 ","pages":"Article 100164"},"PeriodicalIF":3.4,"publicationDate":"2024-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2590197424000119/pdfft?md5=f9a7de24734aba4b725f80aef417972d&pid=1-s2.0-S2590197424000119-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140558902","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Geosteering based on resistivity data and evolutionary optimization algorithm
Maksimilian Pavlov, Georgy Peshkov, Klemens Katterbauer, Abdallah Alshehri
Pub Date: 2024-03-27 | DOI: 10.1016/j.acags.2024.100162
Currently, the oil and gas industry faces numerous challenges in addressing geosteering issues in horizontal drilling. To optimize the extraction of hydrocarbon resources and to avoid penetrating aquifers, industry experts frequently modify the drilling trajectory using real-time measurements. This approach involves quantifying subsurface uncertainties in real time, enhancing operational decision-making with more informed insights but also adding to its complexity. This paper demonstrates an approach to decision making for trajectory correction based on real-time formation evaluation data and the differential evolution algorithm. The approach uses volumetric resistivity log data and data from reservoir models, such as porosity. The methodology suggests corrections to planned well trajectories by maximizing an objective function. The objective function operates on a calculated hydrocarbon saturation environment as the decision-making system in a virtual sequential drilling process. To demonstrate the accuracy and reliability of our approach, we compared the simulations of the corrected trajectory with the preliminary trajectory drilled in the same area. In addition, we conducted several experiments to tune the hyper-parameters of the differential evolution algorithm, selected the optimal parameter set for our case study, and compared the proposed differential evolution algorithm with particle swarm optimization and pattern search algorithms. The results of our experiments showed that the real-time formation evaluation data combined with the differential evolution algorithm outperformed the trajectory provided by the drilling engineers. The differential evolution algorithm demonstrated strong performance compared to the other optimization algorithms. We implemented a complete pipeline: generating resistivity and porosity cubes, using the Archie equation to estimate oil saturation, and then generating a corrected trajectory in this cube based on near-well data, angle constraints, and predefined hyper-parameters set prior to well trajectory planning. The methods developed were validated on synthetic and real datasets. Our decision-making system shows better cumulative oil saturation values than the preliminarily provided horizontal well.
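A minimal sketch of the two building blocks named above: oil saturation from the Archie equation and a differential-evolution search over a trajectory parameter. The Archie constants, the synthetic property column, and the one-parameter trajectory are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np
from scipy.optimize import differential_evolution

a, m, n, R_w = 1.0, 2.0, 2.0, 0.05   # placeholder Archie constants and brine resistivity (ohm-m)

def oil_saturation(porosity, R_t):
    """Archie: S_w = ((a * R_w) / (phi^m * R_t))^(1/n); oil saturation is 1 - S_w."""
    S_w = np.clip(((a * R_w) / (porosity ** m * R_t)) ** (1.0 / n), 0.0, 1.0)
    return 1.0 - S_w

# Synthetic 1D column standing in for the resistivity/porosity cubes: properties vs. true vertical depth
depths = np.linspace(2000.0, 2050.0, 200)
porosity = 0.15 + 0.05 * np.exp(-((depths - 2030.0) / 8.0) ** 2)
R_t = 2.0 + 40.0 * np.exp(-((depths - 2030.0) / 8.0) ** 2)
saturation = oil_saturation(porosity, R_t)

def objective(params):
    """Negative cumulative oil saturation along a constant-TVD lateral (minimized by the optimizer)."""
    tvd = params[0]
    return -np.interp(tvd, depths, saturation) * 100.0   # 100 lateral "cells" at this TVD

result = differential_evolution(objective, bounds=[(2005.0, 2045.0)], seed=1)
print("best landing TVD:", result.x[0], "cumulative saturation:", -result.fun)
```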
{"title":"Geosteering based on resistivity data and evolutionary optimization algorithm","authors":"Maksimilian Pavlov , Georgy Peshkov , Klemens Katterbauer , Abdallah Alshehri","doi":"10.1016/j.acags.2024.100162","DOIUrl":"https://doi.org/10.1016/j.acags.2024.100162","url":null,"abstract":"<div><p>Currently, the oil and gas industry faces numerous challenges in addressing geosteering issues in horizontal drilling. To optimize the extraction of hydrocarbon resources and to avoid penetration in aquifers, industry experts frequently modify the drilling trajectory using real-time measurements. This approach involves quantifying subsurface uncertainties in real-time, enhancing operational decision-making with more informed insights but also adding to its complexity. This paper demonstrates an approach to decision making for trajectory correction based on real-time formation evaluation data and the differential evolution algorithm. The approach uses volumetric resistivity log data and data from reservoir models, such as porosity. The provided methodology suggests corrections for planned well trajectories by maximization of the objective function. The objective function operates with a calculated hydrocarbon saturation environment as the decision-making system in a virtual sequential drilling process. To demonstrate the accuracy and reliability of our approach, we compared the simulations of the corrected trajectory with the preliminary trajectory drilled in the same area. In addition, we conducted several experiments to tune the hyper-parameters of the differential evolution algorithm to select the optimal parameter set for our case study and compared proposed differential evolution algorithm with particle swarm optimization and pattern search algorithms. The results of our experiments showed that the real-time formation evaluation data combined with the differential evolution algorithm outperformed a trajectory provided by the drilling engineers. Differential evolution algorithm demonstrated strong performance compared to others optimization algorithms. We have implemented a complete pipeline from generating resistivity and porosity cubes, using the Archie equation to estimate oil saturation, and consequently generating a corrected trajectory in this cube based on near-well data, angle constraints and predefined hyper-parameters set prior to well trajectory planning. The methods developed were validated on synthetic and real datasets. Our decision-making system shows better cumulative oil saturation values than the preliminary provided horizontal well.</p></div>","PeriodicalId":33804,"journal":{"name":"Applied Computing and Geosciences","volume":"22 ","pages":"Article 100162"},"PeriodicalIF":3.4,"publicationDate":"2024-03-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2590197424000090/pdfft?md5=121ad0b2564ad9df2ff5474153c7c429&pid=1-s2.0-S2590197424000090-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140320952","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}