{"title":"GEPAF: A non-monotonic generalized activation function in neural network for improving prediction with diverse data distributions characteristics","authors":"","doi":"10.1016/j.neunet.2024.106738","DOIUrl":null,"url":null,"abstract":"<div><p>The world today has made prescriptive analytics that uses data-driven insights to guide future actions. The distribution of data, however, differs depending on the scenario, making it difficult to interpret and comprehend the data efficiently. Different neural network models are used to solve this, taking inspiration from the complex network architecture in the human brain. The activation function is crucial in introducing non-linearity to process data gradients effectively. Although popular activation functions such as ReLU, Sigmoid, Swish, and Tanh have advantages and disadvantages, they may struggle to adapt to diverse data characteristics. A generalized activation function named the Generalized Exponential Parametric Activation Function (GEPAF) is proposed to address this issue. This function consists of three parameters expressed: <span><math><mi>α</mi></math></span>, which stands for a differencing factor similar to the mean; <span><math><mi>σ</mi></math></span>, which stands for a variance to control distribution spread; and <span><math><mi>p</mi></math></span>, which is a power factor that improves flexibility; all these parameters are present in the exponent. When <span><math><mrow><mi>p</mi><mo>=</mo><mn>2</mn></mrow></math></span>, the activation function resembles a Gaussian function. Initially, this paper describes the mathematical derivation and validation of the properties of this function mathematically and graphically. After this, the GEPAF function is practically implemented in real-world supply chain datasets. One dataset features a small sample size but exhibits high variance, while the other shows significant variance with a moderate amount of data. An LSTM network processes the dataset for sales and profit prediction. The suggested function performs better than popular activation functions when a comparative analysis of the activation function is performed, showing at least 30% improvement in regression evaluation metrics and better loss decay characteristics.</p></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":null,"pages":null},"PeriodicalIF":6.0000,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608024006622","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
The world today has embraced prescriptive analytics, which uses data-driven insights to guide future actions. The distribution of data, however, differs depending on the scenario, making it difficult to interpret and comprehend the data efficiently. Different neural network models, inspired by the complex network architecture of the human brain, are used to address this. The activation function is crucial in introducing non-linearity so that data gradients can be processed effectively. Although popular activation functions such as ReLU, Sigmoid, Swish, and Tanh each have advantages and disadvantages, they may struggle to adapt to diverse data characteristics. A generalized activation function named the Generalized Exponential Parametric Activation Function (GEPAF) is proposed to address this issue. The function has three parameters: α, a differencing factor similar to the mean; σ, a variance-like factor that controls the spread of the distribution; and p, a power factor that improves flexibility; all three parameters appear in the exponent. When p = 2, the activation function resembles a Gaussian function. The paper first presents the mathematical derivation of the function and validates its properties both mathematically and graphically. The GEPAF function is then applied to real-world supply chain datasets: one has a small sample size but high variance, while the other shows significant variance with a moderate amount of data. An LSTM network processes each dataset for sales and profit prediction. In a comparative analysis, the proposed function outperforms popular activation functions, showing at least a 30% improvement in regression evaluation metrics and better loss-decay characteristics.
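The abstract describes the shape of GEPAF (parameters α, σ, and p in the exponent, reducing to a Gaussian-like curve at p = 2) but does not give the exact formula. The sketch below assumes a generalized-Gaussian-style parametrization, exp(-(|x - α| / σ)^p), which is consistent with that description; the paper's actual definition may differ, and the function name and defaults here are illustrative only.

```python
import numpy as np

def gepaf(x, alpha=0.0, sigma=1.0, p=2.0):
    """Hypothetical GEPAF-style activation (not the paper's exact formula).

    alpha shifts the input (a mean-like differencing factor),
    sigma scales the spread, and p is the power factor in the
    exponent; at p = 2 the curve has a Gaussian-like bell shape.
    """
    return np.exp(-np.abs((x - alpha) / sigma) ** p)

# Quick check: with alpha = 0, sigma = 1, p = 2 the values coincide with
# a unit-height Gaussian bump exp(-x^2).
x = np.linspace(-3.0, 3.0, 7)
print(gepaf(x, alpha=0.0, sigma=1.0, p=2.0))
print(np.exp(-(x ** 2)))
```

Under this assumed form, varying p away from 2 flattens or sharpens the peak, which is one plausible way the extra power factor could add flexibility across data distributions with different spread and tail behavior.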
Journal description:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.