Johannes Viehweg, Dominik Walther, Prof. Dr. -Ing. Patrick Mäder
{"title":"时间卷积衍生多层储层计算","authors":"Johannes Viehweg, Dominik Walther, Prof. Dr. -Ing. Patrick Mäder","doi":"arxiv-2407.06771","DOIUrl":null,"url":null,"abstract":"The prediction of time series is a challenging task relevant in such diverse\napplications as analyzing financial data, forecasting flow dynamics or\nunderstanding biological processes. Especially chaotic time series that depend\non a long history pose an exceptionally difficult problem. While machine\nlearning has shown to be a promising approach for predicting such time series,\nit either demands long training time and much training data when using deep\nrecurrent neural networks. Alternative, when using a reservoir computing\napproach it comes with high uncertainty and typically a high number of random\ninitializations and extensive hyper-parameter tuning when using a reservoir\ncomputing approach. In this paper, we focus on the reservoir computing approach\nand propose a new mapping of input data into the reservoir's state space.\nFurthermore, we incorporate this method in two novel network architectures\nincreasing parallelizability, depth and predictive capabilities of the neural\nnetwork while reducing the dependence on randomness. For the evaluation, we\napproximate a set of time series from the Mackey-Glass equation, inhabiting\nnon-chaotic as well as chaotic behavior and compare our approaches in regard to\ntheir predictive capabilities to echo state networks and gated recurrent units.\nFor the chaotic time series, we observe an error reduction of up to $85.45\\%$\nand up to $87.90\\%$ in contrast to echo state networks and gated recurrent\nunits respectively. 
Furthermore, we also observe tremendous improvements for\nnon-chaotic time series of up to $99.99\\%$ in contrast to existing approaches.","PeriodicalId":501167,"journal":{"name":"arXiv - PHYS - Chaotic Dynamics","volume":"17 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Temporal Convolution Derived Multi-Layered Reservoir Computing\",\"authors\":\"Johannes Viehweg, Dominik Walther, Prof. Dr. -Ing. Patrick Mäder\",\"doi\":\"arxiv-2407.06771\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The prediction of time series is a challenging task relevant in such diverse\\napplications as analyzing financial data, forecasting flow dynamics or\\nunderstanding biological processes. Especially chaotic time series that depend\\non a long history pose an exceptionally difficult problem. While machine\\nlearning has shown to be a promising approach for predicting such time series,\\nit either demands long training time and much training data when using deep\\nrecurrent neural networks. Alternative, when using a reservoir computing\\napproach it comes with high uncertainty and typically a high number of random\\ninitializations and extensive hyper-parameter tuning when using a reservoir\\ncomputing approach. In this paper, we focus on the reservoir computing approach\\nand propose a new mapping of input data into the reservoir's state space.\\nFurthermore, we incorporate this method in two novel network architectures\\nincreasing parallelizability, depth and predictive capabilities of the neural\\nnetwork while reducing the dependence on randomness. 
For the evaluation, we\\napproximate a set of time series from the Mackey-Glass equation, inhabiting\\nnon-chaotic as well as chaotic behavior and compare our approaches in regard to\\ntheir predictive capabilities to echo state networks and gated recurrent units.\\nFor the chaotic time series, we observe an error reduction of up to $85.45\\\\%$\\nand up to $87.90\\\\%$ in contrast to echo state networks and gated recurrent\\nunits respectively. Furthermore, we also observe tremendous improvements for\\nnon-chaotic time series of up to $99.99\\\\%$ in contrast to existing approaches.\",\"PeriodicalId\":501167,\"journal\":{\"name\":\"arXiv - PHYS - Chaotic Dynamics\",\"volume\":\"17 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-07-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - PHYS - Chaotic Dynamics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2407.06771\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - PHYS - Chaotic Dynamics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2407.06771","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The prediction of time series is a challenging task relevant in such diverse
applications as analyzing financial data, forecasting flow dynamics or
understanding biological processes. Chaotic time series that depend on a long
history pose an especially difficult problem. While machine learning has been
shown to be a promising approach for predicting such time series, deep
recurrent neural networks demand long training times and large amounts of
training data. Reservoir computing offers an alternative, but it comes with
high uncertainty, typically requiring many random initializations and
extensive hyper-parameter tuning. In this paper, we focus on the reservoir
computing approach and propose a new mapping of input data into the
reservoir's state space.
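The paper's specific input mapping is not detailed in this abstract; for orientation, the standard echo state network baseline it compares against drives a fixed random reservoir with the input and reads the states out linearly. A minimal sketch of that conventional state update (all parameter defaults here are common but hypothetical choices, not taken from the paper):

```python
import numpy as np

def esn_states(inputs, n_reservoir=100, spectral_radius=0.9,
               input_scaling=0.5, leak_rate=1.0, seed=0):
    """Drive a randomly initialized reservoir with a 1-D input sequence
    and return the sequence of reservoir states (standard ESN update:
    x_{t+1} = (1 - a) x_t + a * tanh(W_in u_{t+1} + W x_t))."""
    rng = np.random.default_rng(seed)
    # Random input and recurrent weights; only the readout is ever trained.
    W_in = rng.uniform(-input_scaling, input_scaling, size=(n_reservoir, 1))
    W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
    # Rescale W so its spectral radius is below 1 (echo state property heuristic).
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        pre = np.tanh(W_in[:, 0] * u + W @ x)
        x = (1.0 - leak_rate) * x + leak_rate * pre  # leaky integration
        states.append(x.copy())
    return np.stack(states)  # shape: (len(inputs), n_reservoir)
```

A linear readout would then be fit on these states, e.g. by ridge regression against the prediction targets. The dependence on the random draw of `W_in` and `W` is exactly the source of uncertainty the abstract refers to.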
Furthermore, we incorporate this method into two novel network architectures,
increasing the parallelizability, depth, and predictive capability of the
neural network while reducing the dependence on randomness. For the
evaluation, we approximate a set of time series from the Mackey-Glass
equation, exhibiting non-chaotic as well as chaotic behavior, and compare the
predictive capabilities of our approaches to those of echo state networks and
gated recurrent units.
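The Mackey-Glass system is a delay differential equation, dx/dt = β x(t−τ) / (1 + x(t−τ)^n) − γ x(t), whose behavior switches from non-chaotic to chaotic as the delay τ grows (chaotic for roughly τ > 16.8, with τ = 17 a common benchmark choice). A simple forward-Euler sketch for generating such a series (integration scheme, step size, and initial history are assumptions, not the paper's setup):

```python
def mackey_glass(n_steps, tau=17.0, beta=0.2, gamma=0.1, n=10,
                 dt=0.1, x0=1.2):
    """Generate a Mackey-Glass time series by forward-Euler integration of
    dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t),
    using a constant history x(t) = x0 for t <= 0."""
    delay = int(round(tau / dt))        # delay expressed in integration steps
    history = [x0] * (delay + 1)        # constant pre-history before t = 0
    for _ in range(n_steps):
        x = history[-1]                 # x(t)
        x_tau = history[-1 - delay]     # x(t - tau)
        history.append(x + dt * (beta * x_tau / (1.0 + x_tau ** n)
                                 - gamma * x))
    return history[delay + 1:]          # drop the artificial pre-history
```

Varying `tau` below and above the chaotic threshold would yield the non-chaotic and chaotic series families the abstract evaluates on.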
For the chaotic time series, we observe an error reduction of up to $85.45\%$
and up to $87.90\%$ compared to echo state networks and gated recurrent
units, respectively. Furthermore, we also observe tremendous improvements of
up to $99.99\%$ for non-chaotic time series compared to existing approaches.
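The percentages above read as relative error reductions against the baseline models; the abstract does not state the underlying error metric (e.g. NRMSE), so the following helper is only a sketch of how such a figure would be computed:

```python
def error_reduction(err_baseline, err_proposed):
    """Relative error reduction in percent: how much lower the proposed
    model's error is than the baseline's, for the same error metric."""
    return 100.0 * (err_baseline - err_proposed) / err_baseline
```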