{"title":"人工神经网络中的同步学习与异步学习","authors":"J. Wang","doi":"10.1109/ICSYSE.1991.161109","DOIUrl":null,"url":null,"abstract":"Conditions of configuring feedforward neural networks without local minima are analyzed for both synchronous and asynchronous learning rules. Based on the analysis, a learning algorithm that integrates a synchronous-asynchronous learning rule with a dynamic configuration rule to train feedforward neural networks is presented. The theoretic analysis and numerical simulation reveal that the proposed learning algorithm substantially reduces the likelihood of local minimum solutions in supervised learning.<<ETX>>","PeriodicalId":250037,"journal":{"name":"IEEE 1991 International Conference on Systems Engineering","volume":"146 40 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Synchronous learning versus asynchronous learning in artificial neural networks\",\"authors\":\"J. Wang\",\"doi\":\"10.1109/ICSYSE.1991.161109\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Conditions of configuring feedforward neural networks without local minima are analyzed for both synchronous and asynchronous learning rules. Based on the analysis, a learning algorithm that integrates a synchronous-asynchronous learning rule with a dynamic configuration rule to train feedforward neural networks is presented. The theoretic analysis and numerical simulation reveal that the proposed learning algorithm substantially reduces the likelihood of local minimum solutions in supervised learning.<<ETX>>\",\"PeriodicalId\":250037,\"journal\":{\"name\":\"IEEE 1991 International Conference on Systems Engineering\",\"volume\":\"146 40 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE 1991 International Conference on Systems Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICSYSE.1991.161109\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE 1991 International Conference on Systems Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICSYSE.1991.161109","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Synchronous learning versus asynchronous learning in artificial neural networks
Conditions for configuring feedforward neural networks without local minima are analyzed for both synchronous and asynchronous learning rules. Based on this analysis, a learning algorithm that integrates a synchronous-asynchronous learning rule with a dynamic configuration rule for training feedforward neural networks is presented. Theoretical analysis and numerical simulation show that the proposed learning algorithm substantially reduces the likelihood of local minimum solutions in supervised learning.
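The abstract does not spell out the update rules, but the core distinction it draws can be illustrated with a minimal sketch: a synchronous rule applies one weight update per pass over all training patterns, whereas an asynchronous rule updates immediately after each pattern. The network shape, squared-error loss, and learning rate below are assumptions made for the example, not the paper's algorithm.

```python
# Illustrative sketch only: contrasts a synchronous (batch) update with an
# asynchronous (per-pattern) update for a single sigmoid layer.
# Architecture, loss, and learning rate are assumptions for this example.
import numpy as np

def forward(W, x):
    # Single linear layer with a sigmoid output (assumed architecture).
    return 1.0 / (1.0 + np.exp(-W @ x))

def grad(W, x, t):
    # Gradient of the squared error for the sigmoid unit above.
    y = forward(W, x)
    return np.outer((y - t) * y * (1.0 - y), x)

def train_synchronous(W, X, T, lr=0.5, epochs=100):
    # Synchronous rule: accumulate gradients over every pattern,
    # then apply a single update per epoch.
    for _ in range(epochs):
        G = sum(grad(W, x, t) for x, t in zip(X, T))
        W = W - lr * G
    return W

def train_asynchronous(W, X, T, lr=0.5, epochs=100):
    # Asynchronous rule: update the weights immediately after
    # each pattern is presented.
    for _ in range(epochs):
        for x, t in zip(X, T):
            W = W - lr * grad(W, x, t)
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 3))           # 8 input patterns, 3 features
    T = rng.integers(0, 2, size=(8, 1))   # binary targets
    W0 = rng.normal(scale=0.1, size=(1, 3))
    print(train_synchronous(W0.copy(), X, T))
    print(train_asynchronous(W0.copy(), X, T))
```

The two loops traverse the same error surface differently: the per-pattern updates of the asynchronous rule introduce pattern-order noise, which is one reason the two rules can behave differently with respect to local minima, the issue the paper analyzes.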