{"title":"对标准反向传播算法变化的性能评估","authors":"P. Karkhanis, G. Bebis","doi":"10.1109/SOUTHC.1994.498078","DOIUrl":null,"url":null,"abstract":"A number of techniques have been proposed recently, which attempt to improve the generalization capabilities of backpropagation neural networks (BPNNs). Among them, weight-decay, cross-validation, and weight-smoothing are probably the most simple and the most frequently used. This paper presents an empirical performance comparison among the above approaches using two real world databases. In addition, in order to further improve generalization, a combination of all the above approaches has been considered and tested. Experimental results illustrate that the coupling of all the three approaches together, significantly outperforms each other individual approach.","PeriodicalId":164672,"journal":{"name":"Conference Record Southcon","volume":"30 4 Pt 2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-03-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A performance evaluation of variations to the standard back-propagation algorithm\",\"authors\":\"P. Karkhanis, G. Bebis\",\"doi\":\"10.1109/SOUTHC.1994.498078\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A number of techniques have been proposed recently, which attempt to improve the generalization capabilities of backpropagation neural networks (BPNNs). Among them, weight-decay, cross-validation, and weight-smoothing are probably the most simple and the most frequently used. This paper presents an empirical performance comparison among the above approaches using two real world databases. In addition, in order to further improve generalization, a combination of all the above approaches has been considered and tested. 
Experimental results illustrate that the coupling of all the three approaches together, significantly outperforms each other individual approach.\",\"PeriodicalId\":164672,\"journal\":{\"name\":\"Conference Record Southcon\",\"volume\":\"30 4 Pt 2 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1994-03-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Conference Record Southcon\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SOUTHC.1994.498078\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Conference Record Southcon","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SOUTHC.1994.498078","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A performance evaluation of variations to the standard back-propagation algorithm
A number of techniques have recently been proposed to improve the generalization capabilities of back-propagation neural networks (BPNNs). Among them, weight decay, cross-validation, and weight smoothing are probably the simplest and most frequently used. This paper presents an empirical performance comparison of these approaches using two real-world databases. In addition, to further improve generalization, a combination of all three approaches was considered and tested. Experimental results show that combining all three approaches significantly outperforms each individual approach.
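As a rough illustration of the first technique the abstract names, the sketch below shows the standard weight-decay (L2 regularization) form of a gradient-descent update. It is a generic textbook formulation, not the paper's specific implementation; the function name and the hyperparameter values (`lr`, `decay`) are placeholders chosen for illustration.

```python
import numpy as np

def sgd_step_with_weight_decay(w, grad, lr=0.1, decay=1e-2):
    """One gradient-descent update with an L2 weight-decay penalty.

    The effective loss is L(w) + (decay / 2) * ||w||^2, so the penalty
    contributes decay * w to the gradient, shrinking weights toward zero
    and discouraging the large weights associated with overfitting.
    """
    return w - lr * (grad + decay * w)

# With a zero loss gradient, repeated steps shrink every weight by the
# same factor (1 - lr * decay) per step, i.e. geometrically toward zero.
w = np.array([1.0, -2.0])
for _ in range(100):
    w = sgd_step_with_weight_decay(w, grad=np.zeros_like(w))
print(np.abs(w).max())  # ≈ 1.81, down from 2.0
```

Cross-validation and weight smoothing attack the same overfitting problem from different angles (held-out model selection and penalizing differences between neighboring weights, respectively), which is why the paper can combine all three.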