{"title":"激活隐藏连接加速递归神经网络的学习","authors":"R. Kamimura","doi":"10.1109/IJCNN.1992.287106","DOIUrl":null,"url":null,"abstract":"A method of accelerating the learning in recurrent neural networks is considered. Owing to a possible large number of connections, it has been expected that recurrent neural networks will converge faster. To activate hidden connections and use hidden units efficiently, a complexity term proposed by D.E. Rumelhart was added to the standard quadratic error function. A complexity term method is modified with a parameter to be normally effective for positive values, while negative values are pushed toward values with larger absolute values. Thus, some hidden connections are expected to be large enough to use hidden units and to speed up the learning. From the author's experiments, it was confirmed that the complexity term was effective in increasing the variance of connections, especially hidden connections, and that eventually some hidden connections were activated and large enough for hidden units to be used in speeding up the learning.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Activated hidden connections to accelerate the learning in recurrent neural networks\",\"authors\":\"R. Kamimura\",\"doi\":\"10.1109/IJCNN.1992.287106\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A method of accelerating the learning in recurrent neural networks is considered. Owing to a possible large number of connections, it has been expected that recurrent neural networks will converge faster. To activate hidden connections and use hidden units efficiently, a complexity term proposed by D.E. Rumelhart was added to the standard quadratic error function. A complexity term method is modified with a parameter to be normally effective for positive values, while negative values are pushed toward values with larger absolute values. Thus, some hidden connections are expected to be large enough to use hidden units and to speed up the learning. 
From the author's experiments, it was confirmed that the complexity term was effective in increasing the variance of connections, especially hidden connections, and that eventually some hidden connections were activated and large enough for hidden units to be used in speeding up the learning.<<ETX>>\",\"PeriodicalId\":286849,\"journal\":{\"name\":\"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks\",\"volume\":\"16 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1992-06-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.1992.287106\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.1992.287106","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Activated hidden connections to accelerate the learning in recurrent neural networks
Abstract: A method for accelerating learning in recurrent neural networks is considered. Because recurrent neural networks can have a large number of connections, they have been expected to converge faster. To activate hidden connections and use hidden units efficiently, a complexity term proposed by D.E. Rumelhart is added to the standard quadratic error function. The complexity term is modified with a parameter so that it acts in the usual way on positive values, while negative values are pushed toward larger absolute values. Some hidden connections are thus expected to grow large enough to bring hidden units into use and to speed up the learning. The author's experiments confirmed that the complexity term was effective in increasing the variance of the connections, especially the hidden connections, and that eventually some hidden connections were activated and became large enough for the hidden units to be used, speeding up the learning.
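As a minimal sketch of the kind of cost function described above (not the paper's exact formulation), the following Python code adds Rumelhart's complexity term, C(w) = sum_i (w_i^2/w0^2) / (1 + w_i^2/w0^2), known from the weight-elimination literature, to a standard quadratic error. The scale w0 and the penalty weight lam are hypothetical choices, and the paper's sign-dependent modification (treating positive and negative weights asymmetrically) is not reproduced here.

```python
import numpy as np

def quadratic_error(targets, outputs):
    """Standard sum-of-squares error E = 1/2 * sum (t - y)^2."""
    return 0.5 * np.sum((targets - outputs) ** 2)

def complexity_term(weights, w0=1.0):
    """Rumelhart-style complexity term: small weights are penalized
    toward zero, while large weights incur a nearly constant cost.
    The scale w0 is an assumed hyperparameter."""
    r = (weights / w0) ** 2
    return np.sum(r / (1.0 + r))

def total_cost(targets, outputs, weights, lam=0.01, w0=1.0):
    """Cost to minimize: quadratic error plus weighted complexity term.
    The penalty weight lam is an assumed hyperparameter."""
    return quadratic_error(targets, outputs) + lam * complexity_term(weights, w0)

def complexity_gradient(weights, w0=1.0):
    """dC/dw_i = 2 w_i / (w0^2 * (1 + (w_i/w0)^2)^2); this would be
    scaled by lam and added to the error gradient during training."""
    r = (weights / w0) ** 2
    return (2.0 * weights / w0 ** 2) / (1.0 + r) ** 2
```

In a recurrent network trained by gradient descent, lam * complexity_gradient(weights) would simply be added to the gradient of the quadratic error at each update; the asymmetric treatment of negative weights described in the abstract would replace this symmetric penalty with one that drives some weights toward larger absolute values.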