Efficient SNR enhancement model for severely contaminated DAS seismic data based on heterogeneous knowledge distillation

GEOPHYSICS | Pub Date: 2024-01-18 | DOI: 10.1190/geo2023-0382.1
Q. Feng, Shigang Wang, Yue Li
{"title":"Efficient SNR enhancement model for severely contaminated DAS seismic data based on heterogeneous knowledge distillation","authors":"Q. Feng, Shignag Wang, Yue Li","doi":"10.1190/geo2023-0382.1","DOIUrl":null,"url":null,"abstract":"Distributed acoustic sensing (DAS) is an emerging seismic acquisition technique with great practical potential. However, various types of noise seriously corrupt DAS signals, making it difficult to recover signals, particularly in low SNR regions. Existing deep learning methods address this challenge by augmenting datasets or strengthening the complex architecture, which can cause over-denoising and a computational power burden. Hence, we propose the heterogeneous knowledge distillation (HKD) method to more efficiently address the signal reconstruction under low SNR. HKD employs ResNet 20 as the teacher and student model (T-S). It utilizes residual learning and skip connections to facilitate feature representation at deeper levels. The main contribution is the training of the T-S framework with different noise levels. The teacher model that was trained using slightly noisy data serves as a powerful feature extractor to capture more accurate signal features, since high quality data is easy to recover. By minimizing the difference between the outputs of T-S models, the student that was trained using severely noisy data can distill the absent signal features from the teacher to improve its own signal recovery, which enables heterogeneous feature distillation. Furthermore, simultaneous learning of negative and positive components (PNL) has been proposed to extract more useful features from the teacher, enabling the T-S framework to learn from both the predicted signal and noise during training. Consequently, a new loss function that combines student denoising loss and HKD loss weighted by PNL was developed to alleviate signal leakage. The experimental results demonstrate that the HKD achieves distinct and consistent signal recovery without increasing computational costs.","PeriodicalId":509604,"journal":{"name":"GEOPHYSICS","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-01-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"GEOPHYSICS","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1190/geo2023-0382.1","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Distributed acoustic sensing (DAS) is an emerging seismic acquisition technique with great practical potential. However, various types of noise severely corrupt DAS records, making signal recovery difficult, particularly in low-SNR regions. Existing deep learning methods address this challenge by augmenting datasets or increasing architectural complexity, which can cause over-denoising and a heavy computational burden. Hence, we propose a heterogeneous knowledge distillation (HKD) method to address signal reconstruction under low SNR more efficiently. HKD employs ResNet20 as both the teacher and the student model (T-S), using residual learning and skip connections to facilitate feature representation at deeper levels. The main contribution is training the T-S framework with different noise levels. The teacher model, trained on slightly noisy data, serves as a powerful feature extractor that captures more accurate signal features, since high-quality data are easier to recover. By minimizing the difference between the outputs of the T-S models, the student, trained on severely noisy data, distills the missing signal features from the teacher to improve its own signal recovery, which enables heterogeneous feature distillation. Furthermore, simultaneous learning of positive and negative components (PNL) is proposed to extract more useful features from the teacher, enabling the T-S framework to learn from both the predicted signal and the predicted noise during training. Consequently, a new loss function that combines the student denoising loss and the PNL-weighted HKD loss is developed to alleviate signal leakage. Experimental results demonstrate that HKD achieves distinct and consistent signal recovery without increasing computational cost.
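To make the training objective concrete, the following is a minimal PyTorch-style sketch of a combined loss of the kind the abstract describes: a student denoising term plus a distillation (HKD) term built from positive (predicted signal) and negative (predicted noise) components. The function name, the equal positive/negative weighting, and the alpha trade-off are assumptions for illustration; the paper's exact formulation is not given in the abstract.

```python
# Hedged sketch of a T-S distillation objective, assuming a PyTorch setup.
import torch
import torch.nn as nn

mse = nn.MSELoss()

def hkd_objective(student_out, teacher_out, student_input, teacher_input,
                  clean_target, alpha=0.5):
    """Student denoising loss plus a PNL-weighted distillation (HKD) term.

    student_out   -- student prediction from the severely noisy record
    teacher_out   -- teacher prediction from the slightly noisy record
    student_input -- severely noisy record fed to the student
    teacher_input -- slightly noisy record fed to the teacher
    clean_target  -- clean reference signal
    alpha         -- assumed weight balancing denoising and distillation
    """
    teacher_out = teacher_out.detach()  # the teacher is frozen during student training

    # Student denoising loss: recover the clean signal directly.
    denoise_loss = mse(student_out, clean_target)

    # Positive component: match the teacher's predicted signal.
    positive = mse(student_out, teacher_out)

    # Negative component: match the teacher's predicted noise
    # (input minus predicted signal), so the student also learns what to remove.
    negative = mse(student_input - student_out, teacher_input - teacher_out)

    # PNL-weighted HKD term (equal weighting of the two components assumed here).
    hkd_term = 0.5 * positive + 0.5 * negative

    return denoise_loss + alpha * hkd_term
```

In this reading, the teacher would be a ResNet20 trained beforehand on slightly noisy data and kept fixed while the student, another ResNet20 fed severely noisy data, minimizes the combined objective.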