A new method in incremental neural network construction by using boosting
X. Wang, D. Brown, T. Haynes, T.M.J. Hui
IEEE International Symposium on Intelligent Signal Processing, 2003 (published 2003-09-04)
DOI: 10.1109/ISP.2003.1275834
Citations: 2
Abstract
A weighted optimisation method based on the AdaBoost algorithm is proposed and used in the incremental construction of neural networks. Compared with the traditional gradient-based method, it is easy to implement and can be applied where the cost function is not smooth. Experimental results are included.
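The paper itself does not spell out the algorithm in this abstract, but the idea it names, boosting-driven incremental construction, can be illustrated with a generic AdaBoost sketch. Below, simple threshold units (decision stumps) stand in for the hidden units that would be added to the network one at a time; each boosting round selects the unit that minimises a weighted error (a weighted optimisation that needs no gradients, so a non-smooth cost poses no problem) and then re-weights the training samples. All function names and the toy dataset are hypothetical, not taken from the paper.

```python
import math

# Toy 1-D dataset with labels in {-1, +1}; no single threshold separates it,
# so several units must be combined.
X = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [+1, +1, -1, -1, +1, +1]

def make_stump(thresh, sign):
    """A single threshold unit: sign * (+1 if x > thresh else -1)."""
    return lambda x: sign * (1 if x > thresh else -1)

def best_stump(X, y, w):
    """Weighted optimisation step: pick the unit minimising weighted error.

    This is an exhaustive search, not gradient descent, so the cost
    function need not be smooth.
    """
    best, best_err = None, float("inf")
    for t in sorted(set(X)):
        for s in (+1, -1):
            h = make_stump(t - 0.5, s)
            err = sum(wi for xi, yi, wi in zip(X, y, w) if h(xi) != yi)
            if err < best_err:
                best, best_err = h, err
    return best, best_err

def boost(X, y, rounds=5):
    """Incrementally grow the ensemble, one unit per AdaBoost round."""
    n = len(X)
    w = [1.0 / n] * n                # uniform sample weights to start
    units = []                       # (alpha, unit) pairs: the growing network
    for _ in range(rounds):
        h, err = best_stump(X, y, w)
        err = max(err, 1e-10)        # guard against log(0)
        alpha = 0.5 * math.log((1 - err) / err)  # vote weight of the new unit
        # Re-weight samples: increase the weight of misclassified ones.
        w = [wi * math.exp(-alpha * yi * h(xi)) for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]
        units.append((alpha, h))
    return units

def predict(units, x):
    """Weighted vote of all units added so far."""
    return 1 if sum(a * h(x) for a, h in units) > 0 else -1

units = boost(X, y)
print([predict(units, x) for x in X])  # → [1, 1, -1, -1, 1, 1]
```

After a few rounds the weighted votes of the added units reproduce the non-separable label pattern, which is the sense in which boosting "constructs" the network incrementally: each round contributes one unit plus its output weight, with no gradient computation anywhere.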