Elastic full-waveform inversion (EFWI) can provide high-resolution subsurface structures and physical properties by iteratively matching observed and synthetic data. However, the success of EFWI relies on the availability of a good initial model and observed data with a high signal-to-noise ratio and sufficient low-frequency content, both of which are often difficult to obtain in practice. In addition, crosstalk between different parameters degrades the inversion results. Recently, inversion methods based on physics-informed deep neural networks (DNNs) have proven effective in mitigating the multiple local minima caused by inaccurate initial models, missing low-frequency information, and noisy seismic data. However, existing DNN-based approaches commonly rely on fixed activation functions (e.g., the rectified linear unit, ReLU). Moreover, their capacity to represent high-frequency components, namely fine-scale structural details, is inherently limited by spectral bias. These limitations may, in turn, impede their broader applicability. To address this issue, we propose a model-reparameterized EFWI method based on a dual-channel convolutional neural network (CNN) and Kolmogorov–Arnold network (KAN) to enhance the reconstruction of fine-scale structural details. Specifically, our network incorporates KAN into the U-Net architecture, where the CNN and KAN operate in dual channels to efficiently capture nonlinear relationships in the data. The hybrid network maps an initial model to the subsurface parameter model, and the network output serves as input to the partial differential equations (PDEs) that generate synthetic data. Various numerical examples are conducted to investigate the performance of the inversion method, including its ability to mitigate parameter crosstalk, its robustness to noise and missing low-frequency information, and the influence of different initial models and network inputs.
The numerical results demonstrate that, by combining the CNN's fixed activation functions with KAN's learnable activations, our method, despite a modest increase in computational cost, outperforms both conventional EFWI and CNN-based reparameterized EFWI in reconstruction accuracy and convergence efficiency.
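The contrast between fixed and learnable activations can be illustrated with a minimal sketch. This is not the authors' implementation: a KAN edge is approximated here by a linear combination of Gaussian radial basis functions whose coefficients are fit by least squares (the basis grid, width, and oscillatory target are hypothetical choices). The example probes the spectral-bias point from the abstract: a learnable univariate function recovers an oscillatory (high-frequency) target that a single fixed ReLU cannot.

```python
import numpy as np

def rbf_basis(x, centers, width):
    """Evaluate Gaussian radial basis functions at points x (shape: len(x) x len(centers))."""
    return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

# Oscillatory target, standing in for fine-scale (high-frequency) structural detail.
x = np.linspace(-1.0, 1.0, 200)
target = np.sin(6.0 * np.pi * x)

# "Learn" the activation: fit RBF coefficients by least squares (KAN-style edge, simplified).
centers = np.linspace(-1.0, 1.0, 64)          # hypothetical basis grid
B = rbf_basis(x, centers, width=0.08)
coeffs, *_ = np.linalg.lstsq(B, target, rcond=None)
kan_fit = B @ coeffs

# Fixed activation for contrast: a single ReLU applied to the same input.
relu_fit = np.maximum(0.0, x)

kan_err = np.sqrt(np.mean((kan_fit - target) ** 2))    # near zero
relu_err = np.sqrt(np.mean((relu_fit - target) ** 2))  # order-one misfit
print(f"learnable-activation RMSE: {kan_err:.2e}, fixed-ReLU RMSE: {relu_err:.2e}")
```

In the full method, such learnable edges sit alongside standard convolutional channels inside a U-Net, and the network's output model, rather than the model itself, is the quantity updated during inversion.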