Near-surface velocity estimation using shear-waves and deep-learning with a U-net trained on synthetic data
Taneesh Gupta, Paul Zwartjes, Udbhav Bamba, Koustav Ghosal, Deepak K. Gupta
Artificial Intelligence in Geosciences, Volume 3, December 2022, Pages 209-224
DOI: 10.1016/j.aiig.2023.01.001
Abstract
Estimation of good velocity models under complex near-surface conditions remains a topic of ongoing research. We propose to predict near-surface velocity profiles from surface waves transformed to phase velocity-frequency panels in a data-driven manner using deep neural networks. This approach differs from many recent works that attempt to estimate velocity directly from reflected body waves or guided waves. A secondary objective is to analyze the influence of various commonly employed deep learning practices, such as transfer learning and data augmentation, on prediction accuracy. Through numerical experiments on synthetic data as well as a real geophysical example, we demonstrate that both transfer learning and data augmentation are helpful when using deep learning for velocity estimation. A third and final objective is to study the lack of generalization of deep learning models to out-of-distribution (OOD) data in the context of our problem, and to present a novel approach to tackle it. We propose a domain adaptation network for training deep learning models that uses a priori knowledge of the range of velocity values to constrain the mapping of the output. The final comparison on field data, which were not part of the training data, shows that the deep neural network predictions compare favorably with a conventional velocity model estimate obtained with a dispersion curve inversion workflow.
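To illustrate the idea of constraining the output mapping with a priori velocity bounds, the following is a minimal sketch, not the authors' implementation: a small PyTorch module that squashes predictions into a plausible range [v_min, v_max] via a scaled sigmoid. The stand-in convolutional encoder, the bound values, and all names are illustrative assumptions; the paper itself uses a U-net trained on synthetic dispersion panels.

```python
# Hedged sketch (assumed code, not from the paper): constrain network output
# to an a priori velocity range using a scaled sigmoid on the final layer.
import torch
import torch.nn as nn


class RangeConstrainedVelocityNet(nn.Module):
    """Maps a phase velocity-frequency panel to a 1-D velocity profile whose
    values are forced into [v_min, v_max]."""

    def __init__(self, n_depths: int = 64,
                 v_min: float = 200.0, v_max: float = 1500.0):  # assumed bounds (m/s)
        super().__init__()
        self.v_min, self.v_max = v_min, v_max
        # Stand-in encoder; a U-net would replace this in practice.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, n_depths),
        )

    def forward(self, panel: torch.Tensor) -> torch.Tensor:
        raw = self.encoder(panel)  # unconstrained outputs
        # A priori velocity bounds constrain the mapping of the output.
        return self.v_min + (self.v_max - self.v_min) * torch.sigmoid(raw)


if __name__ == "__main__":
    net = RangeConstrainedVelocityNet()
    dispersion_panel = torch.randn(2, 1, 128, 128)  # dummy velocity-frequency panels
    profile = net(dispersion_panel)
    print(profile.shape, profile.min().item(), profile.max().item())
```

Bounding the output in this way keeps predictions physically plausible even for out-of-distribution inputs, which is the motivation given in the abstract for using a priori knowledge of the velocity range.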