Fully adaptive neural nonlinear FIR filters
W. C. Siaw, S. L. Goh, A. I. Hanna, Christos Boukis, D. Mandic
Proceedings of the 12th IEEE Workshop on Neural Networks for Signal Processing, 2002. DOI: 10.1109/NNSP.2002.1030039
Citations: 1
Abstract
A class of algorithms for training neural adaptive filters employed in nonlinear adaptive filtering is introduced. Sign algorithms based on the fully adaptive normalised nonlinear gradient descent algorithm (SFANNGD), the normalised nonlinear gradient descent algorithm (SNNGD) and the nonlinear gradient descent algorithm (SNGD) are proposed. The SFANNGD, SNNGD and SNGD algorithms are derived from the principle of the sign algorithm used in least mean square (LMS) filters. Experiments on nonlinear signals confirm that the SFANNGD, SNNGD and SNGD algorithms perform on par with their underlying algorithms, while the sign operation reduces the overall computational complexity of the adaptive filters.
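To make the sign-error idea concrete, here is a minimal Python sketch of an SNGD-style update for a single-neuron adaptive FIR filter with a tanh activation. The filter order, learning rate, activation choice and variable names are illustrative assumptions rather than details taken from the paper; the normalised (SNNGD) and fully adaptive (SFANNGD) variants would additionally normalise or adapt the learning rate at each step.

```python
import numpy as np

def sngd_sign(x, d, eta=0.01, order=4):
    """Sketch of a sign nonlinear gradient descent (SNGD-style) update.

    x     : 1-D input signal
    d     : desired (teaching) signal, same length as x
    eta   : learning rate (illustrative value)
    order : number of FIR taps (illustrative value)
    """
    w = np.zeros(order)          # adaptive tap weights
    y = np.zeros(len(x))         # neuron (filter) output
    e = np.zeros(len(x))         # instantaneous error
    for k in range(order, len(x)):
        u = x[k - order + 1:k + 1][::-1]   # tap-input vector [x(k), ..., x(k-order+1)]
        net = np.dot(w, u)                 # linear combiner output
        y[k] = np.tanh(net)                # nonlinear activation at the neuron output
        e[k] = d[k] - y[k]                 # output error
        # Sign-error update: replace e(k) by sign(e(k)), following the
        # sign-LMS principle, which removes one multiplication per tap.
        grad = 1.0 - y[k] ** 2             # derivative of tanh at the operating point
        w = w + eta * grad * np.sign(e[k]) * u
    return y, e, w
```

As a usage example, one could set d to a one-step-ahead prediction target (e.g. d[k] = x[k] with the tap vector built from past samples only); this is a hypothetical setup for illustration, not the benchmark used in the paper.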