KSIG: Improving the Convergence Rate in Adaptive Filtering Using Kernel Hilbert Space
Eden P. da Silva, C. Estombelo-Montesco, E. Santana
2015 Brazilian Conference on Intelligent Systems (BRACIS), 2015-11-04. DOI: 10.1109/BRACIS.2015.54
Citations: 1
Abstract
Machine learning algorithms are used in many areas. In signal processing, adaptive filtering has been applied to many tasks, such as smoothing, prediction, and equalization. The Least Mean Square (LMS) algorithm is a successful example of this approach; it follows the instantaneous gradient of the cost function in its learning process. Recent works have proposed improvements to LMS-based adaptive filtering. In this context, the Sigmoid Algorithm replaces the LMS cost function, the mean square error, with an even function of the error, which improves the convergence rate of the learning process. In a more elaborate approach, kernel LMS casts the filtering problem in a high-dimensional Hilbert space generated by a kernel function, where the desired filter output is obtained through algebraic operations in that kernel-generated space, resulting in a lower error than LMS. Building on these two improvements, this paper proposes KSIG, a kernel version of the Sigmoid Algorithm, whose results show faster convergence of the learning process compared to kernel LMS.
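To make the two baselines the abstract builds on concrete, below is a minimal sketch of the standard LMS update (a step along the instantaneous gradient of the mean square error) and of kernel LMS, where the filter output is a kernel expansion over past inputs in the kernel-induced Hilbert space. The step sizes, Gaussian kernel width, filter order, and the toy one-step-ahead prediction task are illustrative assumptions; this is not the paper's KSIG algorithm, which would replace the squared-error cost with the sigmoid-based even error function in the same kernel space.

```python
# Sketch of LMS and kernel LMS (KLMS) on a toy prediction task.
# Hyperparameters (mu, sigma, order) and the test signal are illustrative only.
import numpy as np

def lms(x, d, mu=0.05, order=4):
    """Standard LMS: w(n+1) = w(n) + mu * e(n) * u(n)."""
    w = np.zeros(order)
    errors = []
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]          # current regressor vector
        e = d[n] - w @ u                  # instantaneous error
        w = w + mu * e * u                # step on the instantaneous MSE gradient
        errors.append(e)
    return np.array(errors)

def klms(x, d, mu=0.2, order=4, sigma=1.0):
    """Kernel LMS: the output is a kernel expansion over stored past inputs."""
    gauss = lambda a, b: np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))
    centers, coeffs, errors = [], [], []
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]
        y = sum(a * gauss(u, c) for a, c in zip(coeffs, centers))  # output in the RKHS
        e = d[n] - y
        centers.append(u)                 # each input becomes a kernel center
        coeffs.append(mu * e)             # with the scaled error as its coefficient
        errors.append(e)
    return np.array(errors)

if __name__ == "__main__":
    # Toy one-step-ahead prediction of a noisy sinusoid (assumed test signal).
    rng = np.random.default_rng(0)
    t = np.arange(1000)
    x = np.sin(0.05 * t) + 0.1 * rng.standard_normal(len(t))
    d = np.roll(x, -1)                    # desired signal: next sample
    print("LMS  final MSE:", np.mean(lms(x, d)[-200:] ** 2))
    print("KLMS final MSE:", np.mean(klms(x, d)[-200:] ** 2))
```

The learning curves (squared error over iterations) of these two baselines are the natural reference against which a kernelized sigmoid-cost variant would be compared.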