The sub-bottom profiler is a valuable tool for obtaining high-resolution shallow stratigraphic data in marine geological and geophysical surveys. To detect and characterize the structure of small submarine objects, we developed a data processing method that uses 2D data to construct a 3D structural model. We conducted application experiments using sub-bottom profile data from the Chuanshan Islands, acquired with China’s most advanced unmanned exploration platform and a commercial shallow formation profiling system. To create high-resolution 3D seafloor structure models from the recorded 2D sub-bottom profile datasets, an optimized processing sequence was devised, comprising two stages: 2D data processing and 3D data processing. The 2D stage involved spectrum analysis, band-pass filtering, matched filtering, time-varying gain, and surge correction. The subsequent 3D stage encompassed ping location reallocation, static correction, and extraction of feature-layer information. Notably, the final pseudo-3D sub-bottom profile time slice exhibited significant amplitude variations near the target body. This methodology extends the application of 2D sub-bottom profile data and enhances its target recognition capability. To further improve the precision of target body characterization, we used ArcScene 10.0 to create a spatial database for the 3D sub-bottom profile formation model. We then constructed a submarine 3D formation structure model showing the 3D structural characteristics of the target body in detail and identified a seabed target body measuring 6.4 × 9.2 × 10 m.
Title: Application of pseudo-3D sub-bottom profile imaging technology in small submarine target detection. Authors: Tianguang Li, Zhiqing Huang, Xiaobo Zhang, Fansheng Meng, Yifan Pei, Jiali Guo. Acta Geophysica. Pub Date: 2024-06-22. DOI: 10.1007/s11600-024-01343-1
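The 2D processing stage above chains per-trace filtering and gain steps. A minimal sketch of two of them, band-pass filtering and time-varying gain, applied to a synthetic ping is given below; the sampling rate, passband, gain rate, and synthetic signal are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

def bandpass(trace, fs, f_lo, f_hi):
    """Brick-wall band-pass: zero spectral components outside [f_lo, f_hi] Hz."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=trace.size)

def time_varying_gain(trace, fs, db_per_s=40.0):
    """Compensate propagation losses with an exponential gain ramp in time."""
    t = np.arange(trace.size) / fs
    return trace * 10.0 ** (db_per_s * t / 20.0)

# Synthetic 8 kHz ping buried in low-frequency swell noise.
fs = 48_000.0
t = np.arange(0, 0.02, 1.0 / fs)
ping = np.sin(2 * np.pi * 8_000 * t) * np.exp(-((t - 0.01) ** 2) / 1e-6)
trace = ping + 0.5 * np.sin(2 * np.pi * 50 * t)
clean = bandpass(trace, fs, 2_000, 16_000)
gained = time_varying_gain(clean, fs)
```

A matched filter and surge correction would follow the same per-trace pattern before the pings are reallocated in the 3D stage.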
Pub Date: 2024-06-22. DOI: 10.1007/s11600-024-01396-2
Ismailalwali Babikir, Mohamed Elsaadany
Seismic interpretation is a critical aspect of hydrocarbon exploration, where geoscientists often struggle to accurately recognize patterns and anomalies in large datasets. Machine learning techniques offer a promising solution by allowing quick and accurate analysis of multiple large seismic volumes. This study leverages seismic facies analysis, seismic attribute analysis, and supervised machine learning to identify and characterize turbidite deposits in the Dangerous Grounds region, an underexplored area recently revealed by high-resolution broadband seismic data. Through seismic stratigraphy, two distinct phases of turbidite deposition were identified: a lower unit showing higher amplitude and signs of faulting, and an upper, present-day unit characterized by lower amplitude and continuous reflectors. The attribute expression of these turbidites shows a strong amplitude response, high relative acoustic impedance, and high gray-level co-occurrence matrix entropy, emphasizing their distinctiveness from surrounding facies, with variations in reflector continuity and spectral decomposition providing further insight into their depositional processes and sediment characteristics. By applying nine machine learning classifiers with twenty seismic attributes as input, this study achieved over 99% accuracy in distinguishing turbidite facies from background, with the neural network, random forest, K-nearest neighbors, decision tree, and support vector machine classifiers exhibiting optimal performance. The study contributes significantly to the regional understanding of turbidite deposits through detailed machine learning-aided seismic characterization. It underscores the value of integrating domain knowledge with machine learning techniques in enhancing subsurface interpretations, offering a comprehensive methodology for seismic facies analysis in similarly complex and underexplored regions.
Title: Machine learning-based seismic characterization of deepwater turbidites in the Dangerous Grounds area, Northwest Sabah, offshore Malaysia (Acta Geophysica)
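As a toy illustration of attribute-based facies classification, the sketch below trains one of the nine classifier types mentioned above (K-nearest neighbors) on two synthetic attributes; the attribute names, cluster positions, and choice of k are invented for the example and do not reproduce the paper's twenty-attribute workflow:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two seismic attributes: RMS amplitude and GLCM entropy.
# Turbidite samples (label 1) cluster at high amplitude / high entropy.
background = rng.normal([0.2, 0.3], 0.1, size=(200, 2))
turbidite = rng.normal([0.8, 0.8], 0.1, size=(200, 2))
X = np.vstack([background, turbidite])
y = np.r_[np.zeros(200), np.ones(200)]

def knn_predict(X_train, y_train, X_query, k=5):
    """Majority vote among the k nearest training samples (Euclidean distance)."""
    d = np.linalg.norm(X_train[None, :, :] - X_query[:, None, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    return (y_train[nearest].mean(axis=1) > 0.5).astype(float)

# Hold out every 4th sample for testing.
test_idx = np.arange(0, 400, 4)
train_idx = np.setdiff1d(np.arange(400), test_idx)
pred = knn_predict(X[train_idx], y[train_idx], X[test_idx])
accuracy = (pred == y[test_idx]).mean()
```

On well-separated synthetic clusters like these, accuracy approaches 1.0; real attribute distributions overlap far more, which is why the study compares several classifier families.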
Pub Date: 2024-06-19. DOI: 10.1007/s11600-024-01383-7
Manuel J. Aguilar-Velázquez, Xyoli Pérez-Campos, Josué Tago, Carlos Villafuerte
Previous studies have suggested prominent variations in the seismic wave behavior at the 5 s period when traveling across the Valley of Mexico, associating them with the crustal structure and contributing to the anomalous seismic wave patterns observed each time an earthquake hits Mexico City. This article confirms the variations observed at 0.2 Hz by analyzing the diagonal of the Green tensor retrieved from empirical Green functions (EGFs) calculated using cross-correlations of seismic noise data for the vertical and horizontal components. We observe time and phase shifts between the east and north EGF components and show that they can be explained by the crustal structure from the surface down to 20 km depth; we also observe that the time and phase shifts become more significant as the distance between the source and the station increases. Additionally, the article presents an updated version of the velocity model from receiver functions and dispersion curves (VMRFDC v2.0) for the crustal structure under the Valley of Mexico. To validate this model, we compare the EGFs with synthetic Green functions determined numerically. To do so, we adaptively meshed the model using an iterative algorithm to numerically simulate the impulse response up to 0.5 Hz. Finally, the comparisons between noise and synthetic EGFs showed that the VMRFDC v2.0 model explains the time shifts and phase differences at 0.2 Hz previously observed by independent studies, suggesting it correctly characterizes the crustal structure anomalies beneath the Valley of Mexico.
Title: Azimuthal crustal variations and their implications on the seismic impulse response in the Valley of Mexico (Acta Geophysica)
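Retrieving an empirical Green function from ambient noise rests on cross-correlating and stacking simultaneous records at two stations. The toy sketch below recovers a known inter-station delay from synthetic noise; the sampling rate, delay, noise level, and window count are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 10.0        # sampling rate, Hz
n = 2000         # samples per noise window
lag_true = 25    # station B lags station A by 25 samples (2.5 s travel time)

def xcorr(a, b):
    """Linear cross-correlation via zero-padded FFT; index (n - 1) is zero lag."""
    nfft = 2 * n - 1
    c = np.fft.irfft(np.fft.rfft(a, nfft) * np.conj(np.fft.rfft(b, nfft)), nfft)
    return np.roll(c, n - 1)

# Stack the correlations of many independent noise windows: both stations
# record the same wavefield, station B delayed by lag_true samples.
stack = np.zeros(2 * n - 1)
for _ in range(50):
    src = rng.normal(size=n + lag_true)              # common noise wavefield
    rec_a = src[lag_true:] + 0.3 * rng.normal(size=n)
    rec_b = src[:n] + 0.3 * rng.normal(size=n)
    stack += xcorr(rec_b, rec_a)

# The stacked correlation peaks at the inter-station travel time.
lag_est = np.argmax(stack) - (n - 1)
```

Stacking suppresses the incoherent noise terms, which is why long noise records yield stable EGFs even at low signal-to-noise ratios.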
Due to the varieties, random distributions, and rich visual characteristics of volcanic disaster scenes, traditional methods fail to fully express their complex features in remote sensing images. To tackle this problem, a new multi-instance network framework with the Shifted Windows Transformer (Swin-T) and an attention mechanism (MI-STA) is used to classify volcanic disaster scenes from remote sensing images. First, by aggregating the global contextual information of remote sensing image features, the Swin-T extracts multi-scale hierarchical features of volcanic disaster scenes. Second, a channel attention module and a spatial attention module are fused to extract scene features, enhancing the representation of both local details and global information. Finally, the importance weights of the instance features are scored to calculate the attribution probability of each instance. An experiment on the xBD dataset compares the method with commonly used deep network models. The results show that the overall classification accuracy of the proposed method reaches 92.46%, with good performance on the test dataset. We further apply the model to classify the volcanic disaster scenes of the Hunga Tonga-Hunga Ha’apai eruption of January 15, 2022, and the classification images show good consistency with the existing literature.
It provides a new approach for volcanic disaster monitoring using remote sensing images and has broad application prospects.
Title: Volcanic disaster scene classification of remote sensing image based on deep multi-instance network. Authors: Chengfan Li, Jingxin Han, Chengzhi Wu, Lan Liu, Xuefeng Liu, Junjuan Zhao. Acta Geophysica. Pub Date: 2024-06-18. DOI: 10.1007/s11600-024-01394-4
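The instance-scoring step of a multi-instance network can be sketched with generic attention-based pooling: score each instance, softmax the scores into weights, and classify the attention-weighted bag feature. The weights and features below are random stand-ins, and this generic pooling is a simplification, not the exact MI-STA architecture:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_mil_pool(instance_feats, w_att, w_cls):
    """Score instances, turn scores into attention weights, classify the
    attention-weighted bag feature (generic attention-based MIL pooling)."""
    scores = instance_feats @ w_att      # importance score per instance
    alpha = softmax(scores)              # attention weights, sum to 1
    bag = alpha @ instance_feats         # weighted bag representation
    prob = 1.0 / (1.0 + np.exp(-bag @ w_cls))
    return alpha, prob

rng = np.random.default_rng(2)
feats = rng.normal(size=(8, 4))          # 8 image patches, 4-dim features each
feats[3] += 3.0                          # one strongly "volcanic" patch
w = np.ones(4)                           # illustrative weights (untrained)
alpha, p = attention_mil_pool(feats, w, w)
```

The attention weights concentrate on the anomalous instance, which is the mechanism that lets a bag-level label localize the informative patches.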
Pub Date: 2024-06-18. DOI: 10.1007/s11600-024-01395-3
Barbara Lolli, Gianfranco Vannucci, Paolo Gasperini
The Italian Seismological Instrumental and Parametric Database (ISIDe) is the recipient of earthquake data collected in real time by the Istituto Nazionale di Geofisica e Vulcanologia (INGV) and used by studies of earthquake forecasting and seismic hazard assessment in Italy over the last decade. When it went online, following a significant improvement of the INGV seismic acquisition system, it included only data from the second fortnight of April 2005 onward. About ten years later, data back to the beginning of 1985 suddenly appeared, without any notice other than an update to the dataset's starting date. However, the added data differ clearly from the later period, both in the number of located earthquakes and in the types of magnitudes provided. After analyzing the numerical consistency and magnitude calibration of ISIDe from 1985 to 15 April 2005, we conclude that this portion of the dataset is incomplete and poorly calibrated compared with other catalogs of Italian seismicity (CSTI, CSI, and HORUS) available for the same period. Hence, we suggest not using it as is for statistical analyses of Italian seismicity. However, it provides some magnitudes that are missed by other catalogs and thus might be used for improving such catalogs.
Title: Completeness and calibration of the Italian Seismological Instrumental and Parametric Database (ISIDe) before 16 April 2005 (Acta Geophysica)
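Completeness checks like those applied to ISIDe often start from the magnitude-frequency distribution. A common sketch is the maximum-curvature estimate of the completeness magnitude Mc together with Aki's maximum-likelihood b-value, shown here on a synthetic Gutenberg-Richter catalog; the method choice and parameters are illustrative, not the authors' procedure:

```python
import numpy as np

def completeness_and_b(mags):
    """Maximum-curvature Mc (mode of the 0.1-unit magnitude histogram) and
    Aki's maximum-likelihood b-value from events at or above Mc."""
    bins = np.arange(mags.min(), mags.max() + 0.1, 0.1)
    counts, edges = np.histogram(mags, bins=bins)
    mc = edges[np.argmax(counts)]
    above = mags[mags >= mc]
    # For catalogs binned to width dM, subtract dM/2 from mc (Utsu correction).
    b = np.log10(np.e) / (above.mean() - mc)
    return mc, b

# Synthetic Gutenberg-Richter catalog: b = 1.0, complete above M 2.0.
rng = np.random.default_rng(3)
mags = 2.0 + rng.exponential(scale=np.log10(np.e), size=20_000)
mc_est, b_est = completeness_and_b(mags)
```

An incompletely recorded period shows up as an inflated Mc and a biased b-value relative to a reference catalog for the same window, which is the kind of inconsistency the study documents.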
Pub Date: 2024-06-03. DOI: 10.1007/s11600-024-01380-w
Fatih Kadi
The district of Maçka in Trabzon, in the Eastern Black Sea Region of Turkey, frequently experiences landslides, resulting in the highest number of disaster victims. In this study, Landslide Susceptibility Maps (LSMs) were generated via the Statistical-based Frequency Ratio (FR) and Modified Information Value (MIV) models using 10 factors. Of the 150 landslides in the region, 105 (70%) were used to create the maps, and the remaining 45 (30%) were reserved for validation. The models demonstrated success rates of 87.5% and 84.9%, along with prediction rates of 84.8% and 83.1%, respectively, as determined by receiver operating characteristic (ROC) curve and area-under-the-curve (AUC) values. While both models achieved acceptable accuracy, MIV outperformed FR. Additionally, the risk status of 5413 buildings and of forested areas was examined. The results showed that 78.64% (FR) and 80.79% (MIV) of the buildings were situated in high landslide-risk areas. Among forest areas, 39.30% (FR) and 41.35% (MIV) fell in high-risk landslide areas. Next, neighborhood-level landslide risk was examined, revealing risks ranging from 90 to 100% in some areas. The final step concentrated on risk analyses for construction plans in a chosen pilot neighborhood using two criteria: 88.75% of all parcels were in high-risk areas, with hazelnut groves at 79.67% in high-risk zones; conversely, 71.89% of fruit trees were in low-risk areas. The results align with the literature, indicating that LSMs can serve as a versatile base map.
Title: Statistical-based models for the production of landslide susceptibility maps and general risk analyses: a case study in Maçka, Turkey (Acta Geophysica)
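The Frequency Ratio model reduces to a simple pixel-count ratio per factor class. The sketch below computes FR for hypothetical slope-angle classes; all counts are invented for illustration and are not the study's data:

```python
def frequency_ratio(class_landslide_px, class_total_px,
                    total_landslide_px, total_px):
    """FR = (share of landslide pixels in the class) / (share of area in the
    class). FR > 1 marks classes over-represented among landslides."""
    return ((class_landslide_px / total_landslide_px)
            / (class_total_px / total_px))

# Hypothetical slope-angle classes for a 1,000,000-pixel study area with
# 2,000 mapped landslide pixels (illustrative numbers only).
classes = {
    "0-15 deg": (200, 500_000),
    "15-30 deg": (800, 300_000),
    ">30 deg": (1_000, 200_000),
}
fr = {name: frequency_ratio(ls, area, 2_000, 1_000_000)
      for name, (ls, area) in classes.items()}
```

Summing a pixel's FR values across all factor maps gives its susceptibility index, which is then reclassified into the low-to-high susceptibility zones shown on an LSM.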
Pub Date: 2024-06-03. DOI: 10.1007/s11600-024-01379-3
Jianan Yang, Pengxian Fan, Hui Gao, Lu Dong
The Poisson’s ratio of hard rock exhibits a marked stress dependence, which is contrary to its mechanical definition as an elastic constant. Thus, it is of great importance to determine the Poisson’s ratio through a reasonable method. To investigate the Poisson effect in multiple types of hard rock (sandstone, basalt, granite, and marble), uniaxial loading–unloading tests were carried out. The test results indicate that both the tangent Poisson’s ratio and the average Poisson’s ratio gradually increase with the stress level. The stress dependence of the average Poisson’s ratio is reduced under the unloading path, most noticeably in the low and medium stress intervals. Appropriately increasing the number of loading–unloading cycles can also improve the stability of the average Poisson’s ratio to some extent. On this basis, a new method for testing the average Poisson’s ratio is proposed that effectively excludes the effect of irreversible displacement of the rock and improves the stability of the measured value. The test procedure is simple and has good application prospects.
Title: Variation of Poisson’s ratio of hard rocks during compression and an innovative determination method based on axial loading–unloading test (Acta Geophysica)
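A slope-based estimate of the average Poisson's ratio illustrates why fitting over an unloading branch is insensitive to irreversible displacement: a locked-in strain offset shifts the intercept but not the slope. The strain values below are synthetic, and this least-squares sketch is an assumption about the general idea, not the authors' exact procedure:

```python
import numpy as np

def average_poissons_ratio(axial_strain, lateral_strain):
    """Average Poisson's ratio as the negated least-squares slope of lateral
    strain versus axial strain over the chosen stress interval; a constant
    (irreversible) strain offset does not affect the slope."""
    slope = np.polyfit(axial_strain, lateral_strain, 1)[0]
    return -slope

# Synthetic unloading branch: elastic response with nu = 0.25 plus a small
# locked-in lateral offset representing irreversible deformation.
axial = np.linspace(2.0e-3, 0.5e-3, 30)   # axial strain decreasing on unloading
lateral = -0.25 * axial - 1.0e-4          # contraction plus locked-in offset
nu = average_poissons_ratio(axial, lateral)
```

A point-by-point ratio of lateral to axial strain would be contaminated by the offset; the slope over the interval is not, which mirrors the paper's motivation for an interval-averaged definition.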
Pub Date: 2024-06-01. DOI: 10.1007/s11600-024-01381-9
Xianda Feng, Jiazhi He, Bin Lu
Accurate prediction of soil liquefaction is important for preventing geological disasters. Soil liquefaction prediction models based on machine learning algorithms are efficient and accurate; however, some models fail to achieve highly precise soil liquefaction predictions in certain areas because of poor generalizability, which limits their applicability. Thus, a soil liquefaction prediction model was constructed using the CatBoost (CB) algorithm, which natively supports categorical features. The model was trained using standard liquefaction datasets from domestic and foreign sources, and its hyperparameters were optimized with Optuna. Additionally, the model was evaluated using five evaluation metrics, and its performance was compared with that of models based on multi-layer perceptron, support vector machine, random forest, and XGBoost algorithms. Finally, the prediction capability of the model was verified using three case studies. Experimental results demonstrated that the CB-based model generated more accurate soil liquefaction predictions than the comparison models and maintained this performance across datasets. Hence, the proposed model accurately predicts soil liquefaction and offers strong generalizability, demonstrating the potential to contribute toward the prevention and control of soil liquefaction in engineering projects, and toward ensuring the safety and stability of structures built on or near liquefiable soils.
Title: Accurate and generalizable soil liquefaction prediction model based on the CatBoost algorithm (Acta Geophysica)
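For a binary liquefied/non-liquefied problem, a multi-metric evaluation like the one mentioned above can be sketched from the confusion-matrix counts; the prediction vector below is made up for illustration, and the paper's exact five metrics are not reproduced here:

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 from binary labels (liquefied = 1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

# Illustrative labels: 1 = liquefied, 0 = not liquefied.
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
acc, prec, rec, f1 = binary_metrics(y_true, y_pred)
```

Reporting several metrics together matters for liquefaction datasets, which are often class-imbalanced; accuracy alone can look strong while recall on the liquefied class is poor.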
This paper presents the mechanism of scour and empirical equations for evaluating local scour around a bridge pier with and without countermeasures. It critically reviews scour countermeasures, chiefly hydraulic, structural, and biotechnical, up to the present time. Hydraulic countermeasures consist of river training structures and bed armoring. River training works are structures placed parallel, perpendicular, or at an angle to the flow with the aim of modifying it. Armoring is achieved with riprap, partially grouted riprap, cable-tied blocks, grout-filled containers, and gabions. Structural countermeasures include foundation strengthening and pier geometry modifications. Extended footings, underpinning, and pile underpinning relate to foundation strengthening, while pier geometry modifications cover pier features such as shapes, textures, slots, and collars. Biotechnical countermeasures include vegetated riprap, geosynthetic polymers, live staking, and bio-stabilization using extracellular polymeric substances. Different combinations of countermeasures are also discussed. Among hydraulic and structural countermeasures, riprap and collars are most commonly used owing to their scour-reduction efficiency and economic feasibility. Bio-stabilization using extracellular polymeric substances is a novel measure for scour prevention. The literature indicates that pier modifications are the most effective and most active area of research, with the lenticular pier shape, lenticular hooked collar, and airfoil-shaped collar best suited to reducing local scour around the pier. Finally, the limitations of the countermeasures above are presented.
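The abstract above mentions empirical equations for local pier scour without giving one. As an illustration of the family of equations such reviews cover, one widely used relation (the CSU equation from FHWA's HEC-18 guidance — cited here as a representative example, not necessarily one evaluated in this paper) estimates equilibrium scour depth at a pier:

```latex
\frac{y_s}{y_1} = 2.0\, K_1 K_2 K_3 \left(\frac{a}{y_1}\right)^{0.65} Fr_1^{0.43}
```

where \(y_s\) is the scour depth, \(y_1\) the approach flow depth, \(a\) the pier width, \(Fr_1\) the approach-flow Froude number, and \(K_1\), \(K_2\), \(K_3\) correction factors for pier nose shape, flow attack angle, and bed condition, respectively. Countermeasures such as collars act by reducing the effective downflow at the pier nose, which this class of equations captures through the geometry-dependent terms.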
{"title":"Countermeasures for local scour around the bridge pier: a review","authors":"Mangu Rahul Bharadwaj, Lav Kumar Gupta, Manish Pandey, Manousos Valyrakis","doi":"10.1007/s11600-024-01361-z","DOIUrl":"https://doi.org/10.1007/s11600-024-01361-z","url":null,"abstract":"<p>This paper presents the mechanism of scour and empirical equations for evaluating local scour around a bridge pier with and without countermeasures. It critically reviews scour countermeasures, chiefly hydraulic, structural, and biotechnical, up to the present time. Hydraulic countermeasures consist of river training structures and bed armoring. River training works are structures placed parallel, perpendicular, or at an angle to the flow with the aim of modifying it. Armoring is achieved with riprap, partially grouted riprap, cable-tied blocks, grout-filled containers, and gabions. Structural countermeasures include foundation strengthening and pier geometry modifications. Extended footings, underpinning, and pile underpinning relate to foundation strengthening, while pier geometry modifications cover pier features such as shapes, textures, slots, and collars. Biotechnical countermeasures include vegetated riprap, geosynthetic polymers, live staking, and bio-stabilization using extracellular polymeric substances. Different combinations of countermeasures are also discussed. Among hydraulic and structural countermeasures, riprap and collars are most commonly used owing to their scour-reduction efficiency and economic feasibility. Bio-stabilization using extracellular polymeric substances is a novel measure for scour prevention. The literature indicates that pier modifications are the most effective and most active area of research, with the lenticular pier shape, lenticular hooked collar, and airfoil-shaped collar best suited to reducing local scour around the pier. 
Finally, the limitations of the countermeasures mentioned above are presented.</p>","PeriodicalId":6988,"journal":{"name":"Acta Geophysica","volume":null,"pages":null},"PeriodicalIF":2.3,"publicationDate":"2024-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141187916","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-31  DOI: 10.1007/s11600-024-01388-2
JiYuan Liu, Fei Wang, ChengEn Zhang, Yong Zhang, Tao Li
This paper explores the application of artificial intelligence in the petroleum industry, with a specific focus on oil well production forecasting. The study uses the Zananor field as a case study, systematically organizing the raw data, categorizing well instances and production stages in detail, and normalizing the data. A univariate long short-term memory (LSTM) neural network model is constructed with monthly oil production data as input to predict the monthly oil production of the experimental oilfield. A multivariate LSTM model is then introduced, incorporating different production data as input sets to improve the accuracy of monthly oil production predictions. The results are compared with those of a recurrent neural network optimized by particle swarm optimization. Finally, gray relational analysis and principal component analysis are compared as feature selection methods. Experimental results demonstrate that the LSTM model is better suited to the study area and that the multivariate model outperforms the univariate model in prediction accuracy, especially for monthly oil production. Additionally, gray relational analysis exhibits higher accuracy and broader applicability in feature selection than principal component analysis. These findings provide valuable guidance for production forecasting and operational optimization in the petroleum industry.
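The abstract above leans on gray relational analysis (GRA) for feature selection but does not show the computation. A minimal sketch of GRA as commonly defined (Deng's grey relational grade with resolution coefficient ρ = 0.5; the feature names and numbers below are hypothetical, and the paper's exact preprocessing may differ):

```python
# Hedged sketch of gray relational analysis for ranking candidate features
# against a reference series (here, monthly oil production). Feature names
# and values are illustrative, not from the Zananor dataset.

def _min_max(seq):
    """Normalize a sequence to [0, 1]; constant sequences map to 0."""
    lo, hi = min(seq), max(seq)
    return [(x - lo) / (hi - lo) if hi > lo else 0.0 for x in seq]

def gray_relational_grade(reference, candidate, rho=0.5):
    """Grey relational grade (Deng) between a candidate and the reference."""
    ref, cand = _min_max(reference), _min_max(candidate)
    deltas = [abs(r - c) for r, c in zip(ref, cand)]
    d_min, d_max = min(deltas), max(deltas)
    if d_max == 0:  # identical normalized sequences: perfect relation
        return 1.0
    coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in deltas]
    return sum(coeffs) / len(coeffs)

# Rank hypothetical production features against monthly oil production
production = [12.0, 14.5, 13.8, 15.2, 16.0]
features = {
    "water_cut": [30.0, 28.5, 29.0, 27.8, 27.0],
    "gas_oil_ratio": [1.1, 1.4, 1.3, 1.5, 1.6],
}
ranking = sorted(features,
                 key=lambda f: gray_relational_grade(production, features[f]),
                 reverse=True)
```

One design note: GRA as written here rewards sequences whose normalized shape tracks the reference, so anti-correlated features score low unless they are inverted first — a preprocessing choice a study would need to state.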
{"title":"Reservoir production capacity prediction of Zananor field based on LSTM neural network","authors":"JiYuan Liu, Fei Wang, ChengEn Zhang, Yong Zhang, Tao Li","doi":"10.1007/s11600-024-01388-2","DOIUrl":"https://doi.org/10.1007/s11600-024-01388-2","url":null,"abstract":"<p>This paper explores the application of artificial intelligence in the petroleum industry, with a specific focus on oil well production forecasting. The study uses the Zananor field as a case study, systematically organizing the raw data, categorizing well instances and production stages in detail, and normalizing the data. A univariate long short-term memory (LSTM) neural network model is constructed with monthly oil production data as input to predict the monthly oil production of the experimental oilfield. A multivariate LSTM model is then introduced, incorporating different production data as input sets to improve the accuracy of monthly oil production predictions. The results are compared with those of a recurrent neural network optimized by particle swarm optimization. Finally, gray relational analysis and principal component analysis are compared as feature selection methods. Experimental results demonstrate that the LSTM model is better suited to the study area and that the multivariate model outperforms the univariate model in prediction accuracy, especially for monthly oil production. Additionally, gray relational analysis exhibits higher accuracy and broader applicability in feature selection than principal component analysis. 
These research findings provide valuable guidance for production forecasting and operational optimization in the petroleum industry.</p>","PeriodicalId":6988,"journal":{"name":"Acta Geophysica","volume":null,"pages":null},"PeriodicalIF":2.3,"publicationDate":"2024-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141188046","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
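The univariate LSTM setup described in the record above implies framing the monthly production series as supervised (window, next-value) pairs before training. A minimal sketch of that framing step, assuming a simple sliding window (the window length and the numbers are illustrative choices, not taken from the paper):

```python
# Hedged sketch: turning a monthly oil production series into supervised
# (input window, next-month target) pairs for a univariate sequence model.
# Window length 3 and the series values are illustrative only.

def make_windows(series, window=3):
    """Split a 1-D series into (input window, next-value target) pairs."""
    pairs = []
    for i in range(len(series) - window):
        pairs.append((series[i:i + window], series[i + window]))
    return pairs

monthly_oil = [10.0, 11.2, 10.8, 12.1, 12.9, 13.4]
samples = make_windows(monthly_oil, window=3)
# samples[0] == ([10.0, 11.2, 10.8], 12.1)
```

The multivariate variant the abstract describes would extend each window element from a scalar to a vector of per-month production features, leaving the windowing logic unchanged.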