Weight sharing for single-channel LMS

Shamahil Ibunu, Karl Moore, C. C. Took, Danilo P. Mandic
2023 IEEE Statistical Signal Processing Workshop (SSP) · Published 2023-07-02 · DOI: 10.1109/SSP53291.2023.10207966
Citations: 0

Abstract

Constraining a group of taps of an adaptive filter to a single value may seem like a futile task, as weight sharing reduces the degree of freedom of the algorithm, and there are no obvious advantages for implementing such an update scheme. On the other hand, weight sharing is popular in deep learning and underpins the success of convolutional neural networks (CNNs) in numerous applications. To this end, we investigate the advantages of weight sharing in single-channel least mean square (LMS), and propose weight sharing LMS (WSLMS) and partial weight sharing LMS (PWS). In particular, we illustrate how weight sharing can lead to numerous benefits such as an enhanced robustness to noise and a computational cost that is independent of the filter length. Simulations support the analysis.
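The core idea, partitioning the filter taps into groups that share a single coefficient, can be sketched in a few lines. The following is a minimal illustrative implementation of a partial weight-sharing LMS update; the exact update rule, names (`pws_lms`, `group_size`), and step-size choice here are assumptions for illustration, not the authors' published algorithm. Note that because each group only needs its input sum, the per-sample cost scales with the number of groups rather than the filter length (the sums can be maintained recursively in O(1) per group).

```python
import numpy as np

def pws_lms(x, d, filter_len=8, group_size=4, mu=0.01):
    """Illustrative partial weight-sharing LMS: contiguous groups of
    taps share one coefficient. (Hypothetical sketch -- the paper's
    exact derivation may differ.)"""
    n_groups = filter_len // group_size
    w = np.zeros(n_groups)        # one shared weight per group
    buf = np.zeros(filter_len)    # tap-delay line
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(len(x)):
        buf = np.roll(buf, 1)
        buf[0] = x[n]
        # sum the tap inputs within each group; the update then acts
        # on n_groups coefficients instead of filter_len taps
        g = buf.reshape(n_groups, group_size).sum(axis=1)
        y[n] = w @ g
        e[n] = d[n] - y[n]
        w += mu * e[n] * g        # LMS gradient step on shared weights
    return y, e, w
```

When the unknown system's impulse response is itself (approximately) constant within each group, the constrained model loses nothing, and the reduced number of free parameters is what gives the noise robustness the abstract refers to.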