Mathematical Analysis and Performance Evaluation of the GELU Activation Function in Deep Learning

Muenster Journal of Mathematics | IF 0.7, Q2 (Mathematics) | Pub Date: 2023-08-10 | DOI: 10.1155/2023/4229924
Minhyeok Lee
{"title":"Mathematical Analysis and Performance Evaluation of the GELU Activation Function in Deep Learning","authors":"Minhyeok Lee","doi":"10.1155/2023/4229924","DOIUrl":null,"url":null,"abstract":"Selecting the most suitable activation function is a critical factor in the effectiveness of deep learning models, as it influences their learning capacity, stability, and computational efficiency. In recent years, the Gaussian error linear unit (GELU) activation function has emerged as a dominant method, surpassing traditional functions such as the rectified linear unit (ReLU) in various applications. This study presents a rigorous mathematical investigation of the GELU activation function, exploring its differentiability, boundedness, stationarity, and smoothness properties in detail. In addition, we conduct an extensive experimental comparison of the GELU function against a broad range of alternative activation functions, utilizing a residual convolutional network trained on the CIFAR-10, CIFAR-100, and STL-10 datasets as the empirical testbed. Our results demonstrate the superior performance of GELU compared to other activation functions, establishing its suitability for a wide range of deep learning applications. This comprehensive study contributes to a more profound understanding of the underlying mathematical properties of GELU and provides valuable insights for practitioners aiming to select activation functions that optimally align with their specific objectives and constraints in deep learning.","PeriodicalId":43667,"journal":{"name":"Muenster Journal of Mathematics","volume":null,"pages":null},"PeriodicalIF":0.7000,"publicationDate":"2023-08-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Muenster Journal of Mathematics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1155/2023/4229924","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS","Score":null,"Total":0}
引用次数: 1

Abstract

Selecting the most suitable activation function is a critical factor in the effectiveness of deep learning models, as it influences their learning capacity, stability, and computational efficiency. In recent years, the Gaussian error linear unit (GELU) activation function has emerged as a dominant method, surpassing traditional functions such as the rectified linear unit (ReLU) in various applications. This study presents a rigorous mathematical investigation of the GELU activation function, exploring its differentiability, boundedness, stationarity, and smoothness properties in detail. In addition, we conduct an extensive experimental comparison of the GELU function against a broad range of alternative activation functions, utilizing a residual convolutional network trained on the CIFAR-10, CIFAR-100, and STL-10 datasets as the empirical testbed. Our results demonstrate the superior performance of GELU compared to other activation functions, establishing its suitability for a wide range of deep learning applications. This comprehensive study contributes to a more profound understanding of the underlying mathematical properties of GELU and provides valuable insights for practitioners aiming to select activation functions that optimally align with their specific objectives and constraints in deep learning.
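The abstract does not reproduce the function's definition, so the following minimal Python sketch may help for reference. It implements the standard GELU, x * Φ(x) with Φ the standard normal CDF, together with the widely used tanh approximation from Hendrycks and Gimpel, plus a ReLU baseline for comparison. This is the textbook formulation of GELU, not necessarily the exact variants or hyperparameters evaluated in the paper's experiments.

```python
import math

def gelu_exact(x: float) -> float:
    """Exact GELU: x * Phi(x), where Phi is the standard normal CDF."""
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    """Common tanh approximation of GELU (Hendrycks & Gimpel, 2016)."""
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

def relu(x: float) -> float:
    """ReLU baseline: max(0, x)."""
    return max(0.0, x)

if __name__ == "__main__":
    # Tabulate the three activations over a few sample inputs; note that
    # GELU is smooth and non-monotonic near zero, unlike the piecewise-linear ReLU.
    for x in (-3.0, -1.0, -0.5, 0.0, 0.5, 1.0, 3.0):
        print(f"x={x:+.1f}  relu={relu(x):+.4f}  "
              f"gelu_exact={gelu_exact(x):+.4f}  gelu_tanh={gelu_tanh(x):+.4f}")
```

Running the table makes the smoothness property discussed in the paper concrete: GELU is differentiable everywhere, including at zero, where ReLU has a kink, and it assigns small negative outputs to slightly negative inputs rather than clipping them to zero.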