{"title":"Modified backpropagation algorithm with adaptive learning rate based on differential errors and differential functional constraints","authors":"T. Kathirvalavakumar, S. J. Subavathi","doi":"10.1109/ICPRIME.2012.6208288","DOIUrl":null,"url":null,"abstract":"In this paper, a new adaptive learning rate algorithm to train a single hidden layer neural network is proposed. The adaptive learning rate is derived by differentiating linear and nonlinear errors and functional constraints weight decay term at hidden layer and penalty term at output layer. Since the adaptive learning rate calculation involves first order derivative of linear and nonlinear errors and second order derivatives of functional constraints, the proposed algorithm converges quickly. Simulation results show the advantages of proposed algorithm.","PeriodicalId":148511,"journal":{"name":"International Conference on Pattern Recognition, Informatics and Medical Engineering (PRIME-2012)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-03-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Pattern Recognition, Informatics and Medical Engineering (PRIME-2012)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICPRIME.2012.6208288","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
In this paper, a new adaptive learning rate algorithm for training a single-hidden-layer neural network is proposed. The adaptive learning rate is derived by differentiating the linear and nonlinear errors together with the functional constraints: a weight decay term at the hidden layer and a penalty term at the output layer. Since the adaptive learning rate calculation involves first-order derivatives of the linear and nonlinear errors and second-order derivatives of the functional constraints, the proposed algorithm converges quickly. Simulation results show the advantages of the proposed algorithm.
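The abstract only outlines the derivation, so the following is a minimal sketch of the general setup it describes: a single-hidden-layer network trained by backpropagation, with a weight decay term on the hidden-layer weights, a penalty term on the output-layer weights, and a learning rate that adapts each epoch. The function and parameter names (`train`, `eta`, `lam_h`, `lam_o`) are assumptions, and the error-driven grow/shrink rate rule shown here is a simple stand-in, not the paper's derivative-based closed-form rate.

```python
# Minimal sketch (assumptions noted above): single-hidden-layer backprop with
# hidden-layer weight decay, output-layer penalty, and an adaptive learning rate.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, T, n_hidden=5, epochs=2000, eta=0.5, lam_h=1e-4, lam_o=1e-4, seed=0):
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], T.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))   # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))  # hidden -> output weights
    prev_err = np.inf
    for _ in range(epochs):
        # forward pass
        H = sigmoid(X @ W1)          # hidden activations
        Y = sigmoid(H @ W2)          # network output
        E = Y - T                    # output error
        # objective: squared error + weight decay (hidden) + penalty (output)
        err = (0.5 * np.sum(E ** 2)
               + 0.5 * lam_h * np.sum(W1 ** 2)
               + 0.5 * lam_o * np.sum(W2 ** 2))
        # backward pass: gradients of the objective
        dY = E * Y * (1.0 - Y)                 # delta through output sigmoid
        gW2 = H.T @ dY + lam_o * W2            # includes output-layer penalty term
        dH = (dY @ W2.T) * H * (1.0 - H)       # delta through hidden sigmoid
        gW1 = X.T @ dH + lam_h * W1            # includes hidden-layer weight decay
        # stand-in adaptive rate: grow when the error drops, shrink otherwise
        eta = eta * 1.05 if err < prev_err else eta * 0.7
        prev_err = err
        W1 -= eta * gW1
        W2 -= eta * gW2
    return W1, W2, prev_err

if __name__ == "__main__":
    # XOR toy problem as a quick check
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    W1, W2, final_err = train(X, T)
    print("final error:", final_err)
```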