{"title":"Assembly of Echo State Networks Driven by Segregated Low Dimensional Signals","authors":"T. Iinuma, S. Nobukawa, S. Yamaguchi","doi":"10.1109/IJCNN55064.2022.9892881","DOIUrl":null,"url":null,"abstract":"An echo state network (ESN), consisting of an input layer, reservoir, and output layer, provides a higher learning-efficient approach than other recurrent neural networks (RNNs). In the design of ESNs, a sufficiently large number of reservoir neurons is required compared to the dimension of the input signal. Thus, the number of neurons must be increased for high-dimensional input to achieve good performance. However, an increase in the number of neurons increases the computational load. To solve this problem, we propose an assembly ESN (AESN) architecture comprising a feature extraction part that uses multiple sub-ESNs with segregated components of high-dimensional input and a feature integration part. To validate the effectiveness of the proposed AESN, we investigated and compared the conventional ESN with the AESN under high-dimensional input. The results show that the AESN is possibly superior to the conventional ESN in accuracy, memory performance, and computational load. We believe that the AESN also has a correct integration function. Therefore, the proposed method is expected to solve high-dimensional problems with improved accuracy.","PeriodicalId":106974,"journal":{"name":"2022 International Joint Conference on Neural Networks (IJCNN)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN55064.2022.9892881","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
An echo state network (ESN), consisting of an input layer, a reservoir, and an output layer, offers higher learning efficiency than other recurrent neural networks (RNNs). In the design of an ESN, the number of reservoir neurons must be sufficiently large relative to the dimension of the input signal; for high-dimensional input, the number of neurons therefore has to be increased to achieve good performance, which in turn raises the computational load. To solve this problem, we propose an assembly ESN (AESN) architecture comprising a feature extraction part, in which multiple sub-ESNs process segregated components of the high-dimensional input, and a feature integration part. To validate the effectiveness of the proposed AESN, we compared it with the conventional ESN under high-dimensional input. The results suggest that the AESN can outperform the conventional ESN in accuracy, memory performance, and computational load. We also believe that the feature integration part of the AESN functions correctly. The proposed method is therefore expected to solve high-dimensional problems with improved accuracy.
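To make the architecture described in the abstract concrete, the sketch below shows the general assembly-of-ESNs idea in NumPy: a high-dimensional input is split into low-dimensional segments, each segment drives its own small sub-reservoir, and a single ridge-regression readout integrates the concatenated sub-reservoir states. This is a minimal illustration under assumed settings; the task (one-step-ahead prediction of a toy signal), the reservoir sizes, spectral radius, leak rate, and ridge penalty are all hypothetical choices, not the configuration or results reported in the paper.

```python
# Minimal sketch of an assembly of sub-ESNs driven by segregated input components.
# All hyperparameters and the toy task are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)


def make_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.5):
    """Random input and recurrent weights; recurrent matrix rescaled to the target spectral radius."""
    W_in = input_scale * rng.uniform(-1.0, 1.0, size=(n_res, n_in))
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W


def run_reservoir(u_seq, W_in, W, leak=0.3):
    """Drive a leaky-integrator reservoir with an input sequence and return the state sequence."""
    x = np.zeros(W.shape[0])
    states = np.empty((len(u_seq), W.shape[0]))
    for t, u in enumerate(u_seq):
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states[t] = x
    return states


# Toy high-dimensional input: D noisy sinusoidal channels; target is their mean one step ahead.
T, D, n_segments = 2000, 12, 4
phases = rng.uniform(0, 2 * np.pi, D)
U = np.sin(0.05 * np.arange(T)[:, None] + phases) + 0.05 * rng.standard_normal((T, D))
y_target = np.roll(U.mean(axis=1), -1)[:-1]  # one-step-ahead prediction target
U = U[:-1]

# Feature extraction part: each sub-ESN sees only its own low-dimensional slice of the input.
seg_size = D // n_segments
sub_states = []
for s in range(n_segments):
    u_seg = U[:, s * seg_size:(s + 1) * seg_size]
    W_in, W = make_reservoir(seg_size, n_res=50)
    sub_states.append(run_reservoir(u_seg, W_in, W))

# Feature integration part: concatenate sub-reservoir states and fit a ridge-regression readout.
X = np.hstack(sub_states)
washout, ridge = 100, 1e-6
Xw, yw = X[washout:], y_target[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + ridge * np.eye(Xw.shape[1]), Xw.T @ yw)

nrmse = np.sqrt(np.mean((Xw @ W_out - yw) ** 2)) / np.std(yw)
print(f"one-step-ahead NRMSE: {nrmse:.3f}")
```

Because each sub-reservoir only needs to match the dimension of its input segment rather than the full input, the total number of neurons (and hence the cost of the state update and readout training) can stay smaller than that of a single monolithic reservoir handling all channels at once, which is the motivation the abstract gives for the AESN.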