{"title":"A non-sigmoidal activation function for feedforward artificial neural networks","authors":"Pravin Chandra, Udayan Ghose, A. Sood","doi":"10.1109/IJCNN.2015.7280440","DOIUrl":null,"url":null,"abstract":"For a single hidden layer feedforward artificial neural network to possess the universal approximation property, it is sufficient that the hidden layer nodes activation functions are continuous non-polynomial function. It is not required that the activation function be a sigmoidal function. In this paper a simple continuous, bounded, non-constant, differentiable, non-sigmoid and non-polynomial function is proposed, for usage as the activation function at hidden layer nodes. The proposed activation function does require the computation of an exponential function, and thus is computationally less intensive as compared to either the log-sigmoid or the hyperbolic tangent function. On a set of 10 function approximation tasks we demonstrate the efficiency and efficacy of the usage of the proposed activation functions. The results obtained allow us to assert that, at least on the 10 function approximation tasks, the results demonstrate that in equal epochs of training, the networks using the proposed activation function reach deeper minima of the error functional and also generalize better in most of the cases, and statistically are as good as if not better than networks using the logistic function as the activation function at the hidden nodes.","PeriodicalId":6539,"journal":{"name":"2015 International Joint Conference on Neural Networks (IJCNN)","volume":"5 1","pages":"1-8"},"PeriodicalIF":0.0000,"publicationDate":"2015-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2015.7280440","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9
Abstract
For a single hidden layer feedforward artificial neural network to possess the universal approximation property, it is sufficient that the activation functions at the hidden layer nodes are continuous and non-polynomial; the activation function need not be sigmoidal. In this paper a simple continuous, bounded, non-constant, differentiable, non-sigmoidal and non-polynomial function is proposed for use as the activation function at the hidden layer nodes. The proposed activation function does not require the computation of an exponential function, and is thus computationally less intensive than either the log-sigmoid or the hyperbolic tangent function. On a set of 10 function approximation tasks we demonstrate the efficiency and efficacy of the proposed activation function. The results allow us to assert that, at least on these 10 tasks, networks using the proposed activation function reach deeper minima of the error functional in an equal number of training epochs, generalize better in most cases, and are statistically as good as, if not better than, networks using the logistic function as the activation function at the hidden nodes.
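The abstract does not reproduce the closed form of the proposed function, so the following is a minimal sketch only. It uses a hypothetical bounded rational function, g(x) = x / (1 + x^2), chosen because it satisfies all the properties the abstract lists (continuous, bounded, non-constant, differentiable, non-sigmoidal since it is non-monotonic and decays to zero, non-polynomial, and computable without any exponential), shown next to the logistic baseline for comparison. The exact function proposed in the paper may differ.

```python
import numpy as np

def logistic(x):
    # Baseline sigmoidal activation; requires an exponential evaluation.
    return 1.0 / (1.0 + np.exp(-x))

def rational_activation(x):
    # Hypothetical stand-in with the abstract's stated properties:
    # continuous, bounded (|g(x)| <= 0.5), non-constant, differentiable,
    # non-sigmoidal (non-monotonic, g(x) -> 0 as |x| -> inf),
    # non-polynomial, and free of exponential computation.
    # Not necessarily the exact function proposed in the paper.
    return x / (1.0 + x * x)

def rational_activation_grad(x):
    # Derivative of x / (1 + x^2), as would be needed for backpropagation:
    # g'(x) = (1 - x^2) / (1 + x^2)^2.
    d = 1.0 + x * x
    return (1.0 - x * x) / (d * d)

if __name__ == "__main__":
    # Tabulate both activations over a small range to contrast their shapes.
    print(f"{'x':>6}  {'logistic':>9}  {'rational':>9}")
    for x in np.linspace(-5.0, 5.0, 11):
        print(f"{x:+6.2f}  {logistic(x):9.4f}  {rational_activation(x):9.4f}")
```

Such a rational form costs only a handful of multiplications and one division per evaluation, which is the kind of saving over exp-based activations the abstract refers to.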