An improved, highly parallel rank-one eigenvector update method with signal processing applications

R. DeGroat, R. Roberts
{"title":"基于信号处理的一种改进的、高度并行的秩一特征向量更新方法","authors":"R. DeGroat, R. Roberts","doi":"10.1109/ICASSP.1987.1169500","DOIUrl":null,"url":null,"abstract":"In this paper, we discuss rank-one eigenvector updating schemes that are appropriate for tracking time-varying, narrow-band signals in noise. We show that significant reductions in computation are achieved by updating the eigenvalue decomposition (EVD) of a reduced rank version of the data covariance matrix, and that reduced rank updating yields a lower threshold breakdown than full rank updating. We also show that previously published eigenvector updating algorithms [1], [10], suffer from a linear build-up of roundoff error which becomes significant when large numbers of recursive updates are performed. We then show that exponential weighting together with pairwise Gram Schmidt partial orthogonalization at each update virtually eliminates the build-up of error making the rank-one update a useful numerical tool for recursive updating. Finally, we compare the frequency estimation performance of reduced rank weighted linear prediction and the LMS algorithm.","PeriodicalId":140810,"journal":{"name":"ICASSP '87. IEEE International Conference on Acoustics, Speech, and Signal Processing","volume":"21 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1987-04-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"An improved, highly parallel rank-one eigenvector update method with signal processing applications\",\"authors\":\"R. DeGroat, R. Roberts\",\"doi\":\"10.1109/ICASSP.1987.1169500\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we discuss rank-one eigenvector updating schemes that are appropriate for tracking time-varying, narrow-band signals in noise. We show that significant reductions in computation are achieved by updating the eigenvalue decomposition (EVD) of a reduced rank version of the data covariance matrix, and that reduced rank updating yields a lower threshold breakdown than full rank updating. We also show that previously published eigenvector updating algorithms [1], [10], suffer from a linear build-up of roundoff error which becomes significant when large numbers of recursive updates are performed. We then show that exponential weighting together with pairwise Gram Schmidt partial orthogonalization at each update virtually eliminates the build-up of error making the rank-one update a useful numerical tool for recursive updating. Finally, we compare the frequency estimation performance of reduced rank weighted linear prediction and the LMS algorithm.\",\"PeriodicalId\":140810,\"journal\":{\"name\":\"ICASSP '87. IEEE International Conference on Acoustics, Speech, and Signal Processing\",\"volume\":\"21 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1987-04-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ICASSP '87. 
IEEE International Conference on Acoustics, Speech, and Signal Processing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICASSP.1987.1169500\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ICASSP '87. IEEE International Conference on Acoustics, Speech, and Signal Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICASSP.1987.1169500","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4

Abstract

In this paper, we discuss rank-one eigenvector updating schemes that are appropriate for tracking time-varying, narrow-band signals in noise. We show that significant reductions in computation are achieved by updating the eigenvalue decomposition (EVD) of a reduced rank version of the data covariance matrix, and that reduced rank updating yields a lower threshold breakdown than full rank updating. We also show that previously published eigenvector updating algorithms [1], [10], suffer from a linear build-up of roundoff error which becomes significant when large numbers of recursive updates are performed. We then show that exponential weighting together with pairwise Gram Schmidt partial orthogonalization at each update virtually eliminates the build-up of error making the rank-one update a useful numerical tool for recursive updating. Finally, we compare the frequency estimation performance of reduced rank weighted linear prediction and the LMS algorithm.
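The abstract describes, at a high level, an exponentially weighted rank-one update of a reduced-rank EVD followed by pairwise Gram-Schmidt partial re-orthogonalization. Below is a minimal numpy sketch of one such update, assuming a deflation-style augmentation of the tracked subspace and re-orthogonalization of adjacent eigenvector pairs; the function name, the forgetting-factor convention, and these structural choices are illustrative assumptions, not the paper's exact recursion.

```python
# Sketch (not the paper's algorithm): one exponentially weighted rank-one EVD
# update with pairwise Gram-Schmidt partial re-orthogonalization.
import numpy as np

def rank_one_evd_update(U, d, x, forget=0.97):
    """Update the reduced-rank EVD  R ~ U diag(d) U^H  for the exponentially
    weighted covariance  R_new = forget * R + (1 - forget) * x x^H.

    U : (n, r) tracked eigenvectors (orthonormal columns)
    d : (r,)   tracked eigenvalues
    x : (n,)   new data snapshot
    """
    n, r = U.shape
    # Split x into its components inside and outside the tracked subspace.
    z = U.conj().T @ x                  # coordinates of x in the tracked basis
    x_perp = x - U @ z
    rho = np.linalg.norm(x_perp)
    if rho > 1e-12:
        q = x_perp / rho                # new direction entering the subspace
        B = np.column_stack([U, q])     # (n, r+1) augmented basis
        v = np.concatenate([z, [rho]])
        S = forget * np.diag(np.append(d, 0.0)) + (1 - forget) * np.outer(v, v.conj())
    else:
        B = U
        S = forget * np.diag(d) + (1 - forget) * np.outer(z, z.conj())
    # Small eigenproblem (at most (r+1) x (r+1)): this is where the
    # rank-one / reduced-rank structure saves computation over a full n x n EVD.
    lam, G = np.linalg.eigh(S)
    order = np.argsort(lam)[::-1][:r]   # keep the r dominant eigenpairs
    d_new = lam[order]
    U_new = B @ G[:, order]
    # Pairwise Gram-Schmidt partial re-orthogonalization: re-orthogonalize each
    # adjacent column pair so roundoff error does not accumulate over many updates.
    for i in range(r - 1):
        u1 = U_new[:, i]
        u1 /= np.linalg.norm(u1)
        u2 = U_new[:, i + 1]
        u2 -= (u1.conj() @ u2) * u1
        U_new[:, i + 1] = u2 / np.linalg.norm(u2)
    return U_new, d_new
```

Because the eigenproblem solved at each step is only (r+1)-by-(r+1) rather than n-by-n, the per-update cost is dominated by the O(nr^2) basis rotation, which is where the computational savings of reduced-rank updating come from; the pairwise re-orthogonalization is the inexpensive safeguard against the roundoff build-up the abstract mentions.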