Zhenxing Xia, Wei Dai, Xin Liu, Haijun Zhang, Xiaoping Ma
{"title":"建设性神经网络的数据特定激活函数学习","authors":"Zhenxing Xia , Wei Dai , Xin Liu , Haijun Zhang , Xiaoping Ma","doi":"10.1016/j.neucom.2024.129020","DOIUrl":null,"url":null,"abstract":"<div><div>Activation functions play a crucial role in learning and expressive capabilities of advanced neural networks due to their non-linear or non-saturated properties. However, how to determine the appropriate activation function from various candidates is a challenging yet not well-addressed topic. To address the issue, a novel self-learning approach, called as data-specific activation function learning (DSAFL) algorithm, is proposed to establish constructive neural network on one-time by adaptively selecting appropriate activation function based on the specific data characteristics. To assess the space dimension mapping abilities of different activation functions, the configuration probabilities are used to guide the generation of various candidate activation functions and corresponding candidate hidden node. In the learning stage, an exploration-exploitation mechanism composed of the random algorithm and the greedy strategy is developed to obtain the influence of different candidate activation functions, thereby avoiding configuration probabilities falling into local optimum. A reward-penalty mechanism is built to update the configuration probabilities and enhance the robustness of network by integrating the simulated annealing strategy. In final, the activation function with the highest configuration probability, as the best one, is used to reconstruct the neural network. 
Experimental results on both regression and classification tasks demonstrate the efficiency and effectiveness of DSAFL in the activation function selection problems of a class of constructive neural networks.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"617 ","pages":"Article 129020"},"PeriodicalIF":5.5000,"publicationDate":"2024-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Data-specific activation function learning for constructive neural networks\",\"authors\":\"Zhenxing Xia , Wei Dai , Xin Liu , Haijun Zhang , Xiaoping Ma\",\"doi\":\"10.1016/j.neucom.2024.129020\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Activation functions play a crucial role in learning and expressive capabilities of advanced neural networks due to their non-linear or non-saturated properties. However, how to determine the appropriate activation function from various candidates is a challenging yet not well-addressed topic. To address the issue, a novel self-learning approach, called as data-specific activation function learning (DSAFL) algorithm, is proposed to establish constructive neural network on one-time by adaptively selecting appropriate activation function based on the specific data characteristics. To assess the space dimension mapping abilities of different activation functions, the configuration probabilities are used to guide the generation of various candidate activation functions and corresponding candidate hidden node. In the learning stage, an exploration-exploitation mechanism composed of the random algorithm and the greedy strategy is developed to obtain the influence of different candidate activation functions, thereby avoiding configuration probabilities falling into local optimum. 
A reward-penalty mechanism is built to update the configuration probabilities and enhance the robustness of network by integrating the simulated annealing strategy. In final, the activation function with the highest configuration probability, as the best one, is used to reconstruct the neural network. Experimental results on both regression and classification tasks demonstrate the efficiency and effectiveness of DSAFL in the activation function selection problems of a class of constructive neural networks.</div></div>\",\"PeriodicalId\":19268,\"journal\":{\"name\":\"Neurocomputing\",\"volume\":\"617 \",\"pages\":\"Article 129020\"},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2024-11-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neurocomputing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0925231224017910\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231224017910","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Data-specific activation function learning for constructive neural networks
Activation functions play a crucial role in the learning and expressive capabilities of advanced neural networks due to their non-linear, non-saturating properties. However, how to select an appropriate activation function from the many candidates remains a challenging and not well-addressed problem. To address this issue, a novel self-learning approach, the data-specific activation function learning (DSAFL) algorithm, is proposed to build a constructive neural network in a single pass by adaptively selecting an appropriate activation function based on the characteristics of the specific data. To assess the space-dimension mapping abilities of different activation functions, configuration probabilities are used to guide the generation of the candidate activation functions and the corresponding candidate hidden nodes. In the learning stage, an exploration-exploitation mechanism combining a random algorithm with a greedy strategy is developed to capture the influence of the different candidate activation functions, preventing the configuration probabilities from falling into a local optimum. A reward-penalty mechanism that integrates a simulated annealing strategy is built to update the configuration probabilities and enhance the robustness of the network. Finally, the activation function with the highest configuration probability is taken as the best one and used to reconstruct the neural network. Experimental results on both regression and classification tasks demonstrate the efficiency and effectiveness of DSAFL on the activation function selection problem for a class of constructive neural networks.
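The abstract describes the mechanism only at a high level. The following is a minimal, self-contained sketch of how such a selection loop *could* look; every detail (the candidate pool, the toy error measure, the epsilon-greedy split, the annealing schedule, and the reward-penalty rule) is an assumption for illustration, not the paper's actual algorithm.

```python
import math
import random

# Hypothetical pool of candidate activation functions (an assumption; the
# paper's candidate set is not given in the abstract).
ACTIVATIONS = {
    "sigmoid": lambda z: 1.0 / (1.0 + math.exp(-z)),
    "tanh": math.tanh,
    "relu": lambda z: max(0.0, z),
}

def candidate_error(name, data):
    """Toy proxy for the residual error a single random-weight hidden node
    with this activation leaves on the data (illustrative only)."""
    w, b = random.gauss(0, 1), random.gauss(0, 1)
    f = ACTIVATIONS[name]
    return sum((y - f(w * x + b)) ** 2 for x, y in data) / len(data)

def dsafl_select(data, rounds=200, eps=0.3, t0=1.0, lr=0.1, seed=0):
    """Sketch of a DSAFL-style loop: configuration probabilities over the
    candidates, epsilon-greedy exploration-exploitation, and a
    reward-penalty update softened by an annealing temperature."""
    random.seed(seed)
    names = list(ACTIVATIONS)
    prob = {n: 1.0 / len(names) for n in names}   # configuration probabilities
    best_err = {n: float("inf") for n in names}
    for k in range(rounds):
        temp = t0 / (1 + k)                        # annealing schedule
        if random.random() < eps:                  # exploration: random pick
            name = random.choice(names)
        else:                                      # exploitation: greedy pick
            name = max(prob, key=prob.get)
        err = candidate_error(name, data)
        if err < best_err[name]:
            best_err[name] = err                   # reward: raise probability
            delta = lr
        else:                                      # penalty, tempered early on
            delta = -lr * (1 - math.exp(-(err - best_err[name]) / max(temp, 1e-9)))
        prob[name] = max(prob[name] + delta, 1e-6)
        total = sum(prob.values())                 # renormalise to a distribution
        prob = {n: p / total for n, p in prob.items()}
    # the highest-probability activation is kept to (re)build the network
    return max(prob, key=prob.get), prob
```

A usage example: `dsafl_select([(x / 10.0, 1.0 if x > 0 else 0.0) for x in range(-10, 11)])` returns the winning activation name and the final probability distribution over the candidates.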
About the journal:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Its essential topics cover neurocomputing theory, practice, and applications.