{"title":"残差网络在时间序列分类中的敏感性研究","authors":"Sahar Alwadei, Moataz A. Ahmed","doi":"10.1109/caida51941.2021.9425060","DOIUrl":null,"url":null,"abstract":"Time series classification (TCS) is an essential task in many applications. There have been different models proposed for TSC where deep learning models proved to be an excellent option. However, deep learning models' performance is generally known to be highly affected by the settings of their architectural design decisions and values of corresponding hyperparameters. In this research, we study the impact of such decisions and values on Residual Neural Networks (ResNets), a leading deep learning model for TSC. The study considered four factors to be investigated those are the model’s depth and width besides learning and dropout rates. The interplay between the characteristics of time series data and these factors has been looked at as well. A set of designed variants of the model was analyzed statistically, which led to recommend specific settings while building the model. Experimental results show that learning and dropout rates influence the model’s performance the most, while deeper and wider networks did not enhance the performance despite the extended cost of training.","PeriodicalId":272573,"journal":{"name":"2021 1st International Conference on Artificial Intelligence and Data Analytics (CAIDA)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-04-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"On the Sensitivity of Residual Networks for Time Series Classification\",\"authors\":\"Sahar Alwadei, Moataz A. Ahmed\",\"doi\":\"10.1109/caida51941.2021.9425060\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Time series classification (TCS) is an essential task in many applications. There have been different models proposed for TSC where deep learning models proved to be an excellent option. However, deep learning models' performance is generally known to be highly affected by the settings of their architectural design decisions and values of corresponding hyperparameters. In this research, we study the impact of such decisions and values on Residual Neural Networks (ResNets), a leading deep learning model for TSC. The study considered four factors to be investigated those are the model’s depth and width besides learning and dropout rates. The interplay between the characteristics of time series data and these factors has been looked at as well. A set of designed variants of the model was analyzed statistically, which led to recommend specific settings while building the model. 
Experimental results show that learning and dropout rates influence the model’s performance the most, while deeper and wider networks did not enhance the performance despite the extended cost of training.\",\"PeriodicalId\":272573,\"journal\":{\"name\":\"2021 1st International Conference on Artificial Intelligence and Data Analytics (CAIDA)\",\"volume\":\"67 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-04-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 1st International Conference on Artificial Intelligence and Data Analytics (CAIDA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/caida51941.2021.9425060\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 1st International Conference on Artificial Intelligence and Data Analytics (CAIDA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/caida51941.2021.9425060","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Time series classification (TSC) is an essential task in many applications. Different models have been proposed for TSC, among which deep learning models have proved to be an excellent option. However, the performance of deep learning models is known to be highly affected by architectural design decisions and the values of the corresponding hyperparameters. In this research, we study the impact of such decisions and values on Residual Neural Networks (ResNets), a leading deep learning model for TSC. The study investigated four factors: the model's depth and width, along with the learning and dropout rates. The interplay between the characteristics of time series data and these factors was examined as well. A set of designed variants of the model was analyzed statistically, leading to recommended settings for building the model. Experimental results show that the learning and dropout rates influence the model's performance the most, while deeper and wider networks did not enhance performance despite the added training cost.
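For illustration, the sketch below shows how the four studied factors (depth as the number of residual blocks, width as filters per convolution, dropout rate, and learning rate) can parameterize a 1D ResNet for TSC. This is a minimal, hypothetical Keras-style implementation, not the authors' actual code; the block structure and kernel sizes are assumptions.

```python
# Hypothetical 1D ResNet sketch for time series classification (TSC),
# parameterized by the four factors studied: depth, width, dropout rate,
# and learning rate. Not the authors' implementation.
import tensorflow as tf


def residual_block(x, filters, dropout_rate):
    """One residual block: three Conv1D-BN-ReLU stages plus a 1x1 shortcut."""
    shortcut = tf.keras.layers.Conv1D(filters, 1, padding="same")(x)
    shortcut = tf.keras.layers.BatchNormalization()(shortcut)
    for kernel_size in (8, 5, 3):  # assumed kernel sizes for the sketch
        x = tf.keras.layers.Conv1D(filters, kernel_size, padding="same")(x)
        x = tf.keras.layers.BatchNormalization()(x)
        x = tf.keras.layers.Activation("relu")(x)
        x = tf.keras.layers.Dropout(dropout_rate)(x)
    x = tf.keras.layers.Add()([x, shortcut])
    return tf.keras.layers.Activation("relu")(x)


def build_resnet(input_length, n_classes,
                 depth=3, width=64, dropout_rate=0.0, learning_rate=1e-3):
    """Build a ResNet variant whose depth, width, dropout, and learning rate vary."""
    inputs = tf.keras.Input(shape=(input_length, 1))
    x = inputs
    for _ in range(depth):                          # depth: number of residual blocks
        x = residual_block(x, width, dropout_rate)  # width: filters per convolution
    x = tf.keras.layers.GlobalAveragePooling1D()(x)
    outputs = tf.keras.layers.Dense(n_classes, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rate),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model


# Example variant: deeper and wider than the baseline, with dropout and a
# smaller learning rate (illustrative values only).
model = build_resnet(input_length=128, n_classes=5,
                     depth=6, width=128, dropout_rate=0.2, learning_rate=1e-4)
```

Varying these four constructor arguments produces the kind of model variants whose performance the paper compares statistically.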