Interval based Weight Initialization Method for Sigmoidal Feedforward Artificial Neural Networks
Sartaj Singh Sodhi, Pravin Chandra
AASRI Procedia, Vol. 6 (2014), pp. 19-25. DOI: 10.1016/j.aasri.2014.05.004
https://www.sciencedirect.com/science/article/pii/S2212671614000055
Citations: 29
Abstract
The choice of initial weights is an important aspect of the training mechanism for sigmoidal feedforward artificial neural networks. Usually, all weights are initialized to small random values drawn from the same interval. This paper proposes initializing the input-to-hidden-layer weights to random values such that the weights feeding distinct hidden nodes are drawn from distinct intervals. The training algorithm used in the paper is the Resilient Backpropagation algorithm. The efficiency and efficacy of the proposed weight initialization method are demonstrated on 6 function approximation tasks. The results indicate that networks initialized by the proposed method reach a deeper minimum of the error functional during training, generalize better (have lower error on data not used for training), and converge faster than networks initialized by the usual random weight initialization method.
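The core idea described above — drawing the incoming weights of each hidden node from its own disjoint interval rather than a single shared interval — can be sketched as follows. This is an illustrative interpretation only: the function name `interval_init`, the choice of a symmetric range `[-w_max, w_max]`, and the equal-width partitioning are assumptions, since the abstract does not specify the exact interval scheme used in the paper.

```python
import numpy as np

def interval_init(n_in, n_hidden, w_max=1.0, rng=None):
    """Sketch of interval-based weight initialization (hypothetical helper).

    The range [-w_max, w_max] is split into n_hidden disjoint, equal-width
    subintervals; all input-to-hidden weights feeding hidden node h are
    drawn uniformly from subinterval h. The paper's exact scheme may differ.
    """
    rng = np.random.default_rng(rng)
    # n_hidden + 1 edges define n_hidden contiguous subintervals.
    edges = np.linspace(-w_max, w_max, n_hidden + 1)
    W = np.empty((n_hidden, n_in))
    for h in range(n_hidden):
        # Each hidden node's weight row lives in its own subinterval.
        W[h] = rng.uniform(edges[h], edges[h + 1], size=n_in)
    return W

# Example: 3 inputs, 4 hidden nodes, weights spread over [-0.5, 0.5].
W = interval_init(n_in=3, n_hidden=4, w_max=0.5, rng=0)
```

In contrast to the usual scheme, where every weight is sampled from one common interval, this construction guarantees that no two hidden nodes start with overlapping weight ranges, which is the property the paper's method exploits.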