TDU-DLNet: A transformer-based deep unfolding network for dictionary learning

IF 3.6 | CAS Region 2 (Engineering & Technology) | JCR Q2, ENGINEERING, ELECTRICAL & ELECTRONIC | Signal Processing | Pub Date: 2025-06-01 (Epub: 2025-01-13) | DOI: 10.1016/j.sigpro.2025.109886
Kai Wu , Jing Dong , Guifu Hu , Chang Liu , Wenwu Wang
Signal Processing, Volume 231, June 2025, Article 109886. Available at: https://www.sciencedirect.com/science/article/pii/S0165168425000015
Citations: 0

Abstract

Deep unfolding attempts to combine the interpretability of traditional model-based algorithms with the learning ability of deep neural networks by unrolling model-based algorithms as neural networks. Following this framework, several conventional dictionary learning algorithms have been expanded into networks. However, existing deep unfolding networks for dictionary learning are built on formulations with pre-defined priors, e.g., the ℓ1-norm, or learn priors using convolutional neural networks with limited receptive fields. To address these issues, we propose a transformer-based deep unfolding network for dictionary learning (TDU-DLNet). The network is developed by unrolling a general formulation of dictionary learning with an implicit prior on the representation coefficients. The prior is learned by a transformer-based network, where an inter-stage feature fusion module is introduced to reduce information loss across stages. The effectiveness and superiority of the proposed method are validated on image denoising. Experiments on widely used datasets demonstrate that the proposed method achieves competitive results with fewer parameters compared with deep learning and other deep unfolding methods.
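To illustrate the unrolling idea the abstract refers to, the following is a minimal sketch (not the authors' actual architecture) of a classical ISTA-style sparse-coding iteration written as a fixed number of "stages", which is the starting point deep unfolding networks turn into trainable layers. In TDU-DLNet the hand-crafted soft-threshold prior step below is replaced by a learned transformer module; here we keep the analytic ℓ1 prox so the sketch is self-contained.

```python
import numpy as np

def soft_threshold(x, theta):
    # Analytic proximal operator of the l1-norm prior; deep unfolding
    # methods such as TDU-DLNet replace this step with a learned network.
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def unfolded_sparse_coding(Y, D, n_stages=10, theta=0.1):
    """Run n_stages unrolled ISTA iterations for the coding step of
    dictionary learning: each stage is a gradient step on the data-fidelity
    term 0.5 * ||Y - D X||_F^2 followed by a prior (prox) step."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2   # 1/L, L = Lipschitz const. of the gradient
    X = np.zeros((D.shape[1], Y.shape[1]))   # representation coefficients
    for _ in range(n_stages):
        grad = D.T @ (D @ X - Y)             # gradient of the data-fidelity term
        X = soft_threshold(X - step * grad, step * theta)  # prior step
    return X
```

Unrolled as a network, each loop iteration becomes one stage with its own learnable parameters (step size, threshold, or, as in the paper, an implicit transformer-based prior), and the inter-stage feature fusion module passes features between such stages.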
Source Journal
Signal Processing (Engineering: Electrical & Electronic)
CiteScore: 9.20
Self-citation rate: 9.10%
Articles per year: 309
Review time: 41 days
Journal Description: Signal Processing incorporates all aspects of the theory and practice of signal processing. It features original research work, tutorial and review articles, and accounts of practical developments. It is intended for a rapid dissemination of knowledge and experience to engineers and scientists working in the research, development or practical application of signal processing. Subject areas covered by the journal include: Signal Theory; Stochastic Processes; Detection and Estimation; Spectral Analysis; Filtering; Signal Processing Systems; Software Developments; Image Processing; Pattern Recognition; Optical Signal Processing; Digital Signal Processing; Multi-dimensional Signal Processing; Communication Signal Processing; Biomedical Signal Processing; Geophysical and Astrophysical Signal Processing; Earth Resources Signal Processing; Acoustic and Vibration Signal Processing; Data Processing; Remote Sensing; Signal Processing Technology; Radar Signal Processing; Sonar Signal Processing; Industrial Applications; New Applications.
Latest articles from this journal:
Diffusion model-based covariance matrix recovery method for DOA estimation
Joint estimation of sea state and vessel parameters using a mass-spring-damper equivalence model
An underwater acoustic target recognition model based on heterogeneous spectral attention feature fusion
Target scattering extraction via non-coherent collaboration using radar sensor network
Variational time-frequency mode tracking for micro-Doppler signature extraction