Alopex algorithm for training multilayer neural networks
K. P. Venugopal, A. S. Pandya
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks, 18 November 1991. DOI: 10.1109/IJCNN.1991.170403

Abstract: The use of the Alopex algorithm for training multilayer neural networks is described. Alopex is a biologically influenced stochastic parallel process designed to find the global minimum of error surfaces. It has a number of advantages over other algorithms such as backpropagation, reinforcement learning, and the Boltzmann machine. The authors investigate the efficacy of the algorithm for faster convergence by considering different error functions, and discuss the specifics of the algorithm for applications involving learning tasks. Results of computer simulations on standard problems such as XOR, parity, symmetry, and encoders of different dimensions are presented and compared with those obtained using backpropagation. A temperature perturbation scheme is proposed which allows the algorithm to escape strong local minima.