Comparison between Adam, AdaMax and Adam W optimizers to implement a Weather Forecast based on Neural Networks for the Andean city of Quito
Ricardo Xavier Llugsi Cañar, S. Yacoubi, Allyx Fontaine, P. Lupera
2021 IEEE Fifth Ecuador Technical Chapters Meeting (ETCM), published 2021-10-12. DOI: 10.1109/ETCM53643.2021.9590681
Citations: 25
Abstract
The main function of an optimizer is to determine how much to change the weights and the learning rate of the neural network in order to reduce losses. One of the best-known optimizers is Adam, whose main advantage is that the magnitudes of its parameter updates are invariant to rescaling of the gradient. However, other optimizers are often chosen because they generalize better. AdamW is a variant of Adam in which the weight decay is applied only after the parameter-wise step size has been computed. To provide a comparative scenario for optimizers, in the present work a temperature forecast for the Andean city of Quito was implemented using a neural network structure with uncertainty reduction, and three optimizers (Adam, AdaMax and AdamW) were analyzed. For the comparison, three error metrics were computed per hour to determine the effectiveness of the prediction. The analysis shows that Adam and AdaMax behave similarly, reaching a maximum MSE per hour of 2.5°C, whereas AdamW reduces this error to around 1.3°C.
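To make the distinction the abstract draws concrete, below is a minimal NumPy sketch of a single update step for each of the three optimizers compared in the paper. This is not the authors' implementation, and the hyperparameter defaults (learning rate, betas, weight decay, epsilon) are illustrative assumptions, not values from the paper. The key contrast to notice is that classic Adam folds weight decay (L2 regularization) into the gradient before the adaptive rescaling, whereas AdamW applies the decay to the parameters directly, after the parameter-wise step size has been determined.

```python
import numpy as np

def adam_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8, wd=1e-2):
    """Adam with L2 regularization: decay is folded into the gradient,
    so it gets rescaled by the adaptive step size like everything else."""
    g = g + wd * w                       # L2 term enters the gradient
    m = b1 * m + (1 - b1) * g            # first-moment estimate
    v = b2 * v + (1 - b2) * g**2         # second-moment estimate
    m_hat = m / (1 - b1**t)              # bias correction
    v_hat = v / (1 - b2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

def adamax_step(w, g, m, u, t, lr=2e-3, b1=0.9, b2=0.999, eps=1e-8):
    """AdaMax: replaces the second moment with an infinity-norm estimate,
    which removes the need for bias correction on that term."""
    m = b1 * m + (1 - b1) * g
    u = np.maximum(b2 * u, np.abs(g))    # exponentially weighted inf-norm
    w = w - (lr / (1 - b1**t)) * m / (u + eps)
    return w, m, u

def adamw_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8, wd=1e-2):
    """AdamW: the adaptive step is computed from the raw gradient first;
    weight decay is then applied to the parameters, decoupled from the
    gradient-based rescaling."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g**2
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    w = w - lr * (m_hat / (np.sqrt(v_hat) + eps) + wd * w)
    return w, m, v
```

In a real training loop these steps would be applied per batch with `t` incremented each call and the moment buffers (`m`, `v` or `u`) initialized to zeros of the same shape as `w`; frameworks such as PyTorch expose the same three variants as `torch.optim.Adam`, `torch.optim.Adamax` and `torch.optim.AdamW`.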