{"title":"Learning of Neural Networks Based on Weighted Mean Squares Error Function","authors":"Sai Yang, Jinxia Ren, Zhongxia Li","doi":"10.1109/ISCID.2009.67","DOIUrl":null,"url":null,"abstract":"In weighted mean squares error (WMSE) function, each sample error multiplies a weighting coefficient, then it can make noise error have a smaller proportion in the cost function, even the outliers can’t affect the learning of the neural networks by tuning the smooth parameter , which enhances the anti-noise ability of neural networks. If the samples don’t have noise samples, weighted mean squares error function can make neural networks avoid over-fitting. When the neural networks are linear models, the new cost function turns into a realization of weighted least squares method, the simulation results show the advantages and application conditions of the weighted squares error function.","PeriodicalId":294370,"journal":{"name":"International Symposium on Computational Intelligence and Design","volume":"42 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Symposium on Computational Intelligence and Design","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISCID.2009.67","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
In the weighted mean squares error (WMSE) function, each sample's error is multiplied by a weighting coefficient, so that noisy samples contribute a smaller proportion of the cost function; by tuning the smoothing parameter, even outliers can be prevented from affecting the learning of the neural network, which enhances its noise robustness. When the samples contain no noise, the weighted mean squares error function helps the neural network avoid over-fitting. When the neural network is a linear model, the new cost function reduces to a realization of the weighted least squares method. Simulation results demonstrate the advantages and application conditions of the weighted mean squares error function.
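The abstract does not give the paper's exact weighting scheme, so the following is only a minimal sketch of the general WMSE idea, assuming a hypothetical Gaussian-of-residual weighting controlled by a smoothing parameter `sigma`; the function name `wmse` and the weighting formula are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of a weighted mean squares error (WMSE) loss.
# Assumption: weights shrink for large residuals via w_i = exp(-e_i^2 / (2*sigma^2)),
# so outliers contribute less to the cost; the paper's actual weighting and
# smoothing parameter are not specified in the abstract.
import numpy as np

def wmse(y_pred, y_true, sigma=1.0):
    """Weighted mean squares error: each squared error is scaled by a weight
    that decreases as the residual grows."""
    e = y_pred - y_true                          # per-sample errors
    w = np.exp(-(e ** 2) / (2.0 * sigma ** 2))   # hypothetical weighting coefficients
    return np.mean(w * e ** 2)

# Example: a single outlier inflates plain MSE far more than this WMSE variant.
y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 14.0])         # last prediction is an outlier
print("MSE :", np.mean((y_pred - y_true) ** 2))
print("WMSE:", wmse(y_pred, y_true, sigma=1.0))
```

With this kind of weighting, the outlier's weight is nearly zero, so it barely moves the cost, which matches the anti-noise behaviour described above; for clean data, setting a large `sigma` makes all weights close to one and the loss approaches the ordinary mean squares error.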