{"title":"一种求神经网络误差函数全局最小值的权重进化算法","authors":"S. Ng, S. Leung","doi":"10.1109/CEC.2000.870289","DOIUrl":null,"url":null,"abstract":"This paper introduces a new weight evolution algorithm to find the global minimum of the error function in a multi-layered neural network. During the learning phase of backpropagation, the network weights are adjusted intentionally in order to have an improvement in system performance. By looking at the system outputs of the nodes, it is possible to adjust some of the network weights deterministically so as to achieve an overall reduction in system error. The idea is to work backward from the error components and the system outputs to deduce a deterministic perturbation on particular network weights for optimization purposes. Using the new algorithm, it is found that the weight evolution between the hidden and output layer can accelerate the convergence speed, whereas the weight evolution between the input layer and the hidden layer can assist in solving the local minima problem.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"A weight evolution algorithm for finding the global minimum of error function in neural networks\",\"authors\":\"S. Ng, S. Leung\",\"doi\":\"10.1109/CEC.2000.870289\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper introduces a new weight evolution algorithm to find the global minimum of the error function in a multi-layered neural network. During the learning phase of backpropagation, the network weights are adjusted intentionally in order to have an improvement in system performance. By looking at the system outputs of the nodes, it is possible to adjust some of the network weights deterministically so as to achieve an overall reduction in system error. The idea is to work backward from the error components and the system outputs to deduce a deterministic perturbation on particular network weights for optimization purposes. Using the new algorithm, it is found that the weight evolution between the hidden and output layer can accelerate the convergence speed, whereas the weight evolution between the input layer and the hidden layer can assist in solving the local minima problem.\",\"PeriodicalId\":218136,\"journal\":{\"name\":\"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)\",\"volume\":\"3 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2000-07-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CEC.2000.870289\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. 
No.00TH8512)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CEC.2000.870289","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A weight evolution algorithm for finding the global minimum of error function in neural networks
This paper introduces a new weight evolution algorithm for finding the global minimum of the error function in a multilayer neural network. During the backpropagation learning phase, the network weights are adjusted deliberately so as to improve system performance. By examining the node outputs, some of the network weights can be adjusted deterministically to achieve an overall reduction in system error. The idea is to work backward from the error components and the node outputs to deduce a deterministic perturbation of particular network weights for optimization purposes. Using the new algorithm, it is found that weight evolution between the hidden and output layers accelerates convergence, whereas weight evolution between the input and hidden layers helps the network escape local minima.
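The abstract describes the mechanism only at a high level. The following is a minimal Python sketch of the general idea under stated assumptions: a one-hidden-layer sigmoid network trained on XOR, where a stall in the sum-squared error triggers a deterministic perturbation deduced from the error components and node outputs. The stall test, the particular perturbation rule, and all variable names are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Hedged sketch of the weight-evolution idea from the abstract (not the
# authors' exact formulation): train with backpropagation, and when the
# error stalls, work backward from the error components to perturb
# selected weights deterministically rather than at random.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: a classic task on which plain backprop can get stuck in local minima.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 3))   # input  -> hidden weights
W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden -> output weights
lr = 0.5
prev_err = np.inf

for epoch in range(20000):
    # Forward pass.
    H = sigmoid(X @ W1)                    # hidden-node outputs
    Y = sigmoid(H @ W2)                    # network outputs
    E = T - Y                              # error components per sample
    err = float(np.sum(E ** 2))

    # Standard backpropagation updates.
    d2 = E * Y * (1 - Y)
    d1 = (d2 @ W2.T) * H * (1 - H)
    W2 += lr * H.T @ d2
    W1 += lr * X.T @ d1

    # Weight-evolution step (illustrative rule): if the error has stalled,
    # deduce a deterministic perturbation from the error components and the
    # node outputs -- here, nudge the hidden->output weight of the most
    # active hidden node in the direction that reduces the worst error.
    if abs(prev_err - err) < 1e-9 and err > 1e-3:
        worst = int(np.argmax(np.abs(E)))  # sample with the largest error
        node = int(np.argmax(H[worst]))    # most responsible hidden node
        W2[node, 0] += np.sign(E[worst, 0]) * 0.1 * H[worst, node]
    prev_err = err

    if err < 1e-3:
        break

print(f"stopped after {epoch + 1} epochs, SSE = {err:.5f}")
```

The design point the sketch illustrates is that the perturbation is computed from observed quantities (which sample is worst, which node contributed most), so the escape from a flat region is deterministic rather than a blind random restart.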