Yang Shen, Xiao-lin Hu, Tong-dong Wang, Jia-jia Cui, Si-hao Tao, Ao Li, Qiang Lu, De-zhi Zhang, Wei-guo Xiao
{"title":"CNN-based automated trace editing method using Hough transform","authors":"Yang Shen, Xiao-lin Hu, Tong-dong Wang, Jia-jia Cui, Si-hao Tao, Ao Li, Qiang Lu, De-zhi Zhang, Wei-guo Xiao","doi":"10.1007/s11770-023-1068-1","DOIUrl":null,"url":null,"abstract":"<p>Seismic trace editing is a tedious process in data preprocessing that can incur high time costs, especially when handling large 3D datasets. In addition, existing methods to edit seismic traces may miss vital information when killing noisy traces simply. Thus, in this paper, we propose an automated method to edit seismic traces based on machine learning. The proposed method combines the Hough transform technique and a convolutional neural network (CNN) to improve the feasibility of the scheme. The Hough transform is a feature extraction technique that helps identify anomaly lines in images, and we employ it in the proposed method to ascertain the prospective positions of noisy and bad traces. We then implement a bandpass filter and the trained CNN model to identify the precise noisy traces in the target region indicated by the Hough transform process. Upon identification, automated processing is applied to determine whether the processed traces can be useful or should be discarded. This comprehensive framework includes four main steps, i.e., data preprocessing, Hough transform detection, network training, and network prediction. 
Experiments conducted on real-world data yielded 98% accuracy, which indicates the potential efficacy of the proposed automated trace editing method in practical applications.</p>","PeriodicalId":55500,"journal":{"name":"Applied Geophysics","volume":"8 1","pages":""},"PeriodicalIF":0.7000,"publicationDate":"2024-03-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Geophysics","FirstCategoryId":"89","ListUrlMain":"https://doi.org/10.1007/s11770-023-1068-1","RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"GEOCHEMISTRY & GEOPHYSICS","Score":null,"Total":0}
Citations: 0
Abstract
Seismic trace editing is a tedious step in data preprocessing that can incur high time costs, especially when handling large 3D datasets. In addition, existing methods that simply kill noisy traces may discard vital information. Thus, in this paper, we propose an automated, machine-learning-based method for editing seismic traces. The proposed method combines the Hough transform technique with a convolutional neural network (CNN) to improve the feasibility of the scheme. The Hough transform is a feature extraction technique that helps identify anomalous lines in images, and we employ it to locate the prospective positions of noisy and bad traces. We then apply a bandpass filter and the trained CNN model to precisely identify the noisy traces within the target region indicated by the Hough transform. Upon identification, automated processing determines whether the processed traces can be retained or should be discarded. This comprehensive framework comprises four main steps: data preprocessing, Hough transform detection, network training, and network prediction. Experiments conducted on real-world data yielded 98% accuracy, which indicates the potential efficacy of the proposed automated trace editing method in practical applications.
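The detection-then-verification pipeline described above can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: it restricts the Hough transform to vertical lines (where the accumulator over ρ reduces to a per-trace vote count on thresholded samples), uses an assumed 10–30 Hz passband, and replaces the trained CNN with a simple energy-ratio rule as a stand-in classifier. All trace positions, thresholds, and filter parameters are hypothetical.

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(0)

n_traces, n_samples, dt = 64, 512, 0.004   # hypothetical survey geometry
t = np.arange(n_samples) * dt

# Synthetic section: a 20 Hz reflection signal plus weak background noise.
section = np.sin(2 * np.pi * 20 * t)[None, :] * np.ones((n_traces, 1))
section += 0.1 * rng.standard_normal((n_traces, n_samples))

# Inject strong broadband noise into two traces at known (hypothetical) positions.
bad = [10, 37]
for i in bad:
    section[i] += 3.0 * rng.standard_normal(n_samples)

# Step 1: Hough-style detection, restricted to vertical lines (theta = 0).
# After binarizing by amplitude, the accumulator over rho = trace index is
# simply the number of above-threshold samples voting for each trace.
binary = np.abs(section) > 2.0
votes = binary.sum(axis=1)
candidates = np.where(votes > 0.1 * n_samples)[0]

# Step 2: bandpass-filter the candidate traces (assumed 10-30 Hz signal band).
b, a = butter(4, [10, 30], btype="band", fs=1 / dt)
flagged = []
for i in candidates:
    filtered = filtfilt(b, a, section[i])
    residual = section[i] - filtered
    # Stand-in for the paper's CNN classifier: flag the trace as bad when
    # its out-of-band energy dominates the in-band energy after filtering.
    if np.sum(residual**2) > np.sum(filtered**2):
        flagged.append(i)

print(flagged)  # → [10, 37]
```

In this toy setup both injected noisy traces are recovered; in the paper, the energy-ratio rule in step 2 is instead a trained CNN, which can exploit waveform shape rather than energy alone.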
Journal Introduction
The journal provides an academic forum for a broad blend of academic and industry papers, promoting rapid communication and the exchange of ideas between Chinese and worldwide geophysicists.
The publication covers the applications of geoscience, geophysics, and related disciplines in the fields of energy, resources, environment, disaster, engineering, information, military, and surveying.