Wav-KAN: Wavelet Kolmogorov-Arnold Networks

Zavareh Bozorgasl, Hao Chen
{"title":"Wav-KAN:小波 Kolmogorov-Arnold 网络","authors":"Zavareh Bozorgasl, Hao Chen","doi":"10.2139/ssrn.4835325","DOIUrl":null,"url":null,"abstract":"In this paper , we introduce Wav-KAN, an innovative neural network architecture that leverages the Wavelet Kolmogorov-Arnold Networks (Wav-KAN) framework to enhance interpretability and performance. Traditional multilayer perceptrons (MLPs) and even recent advancements like Spl-KAN face challenges related to interpretability, training speed, robustness, computational efficiency, and performance. Wav-KAN addresses these limitations by incorporating wavelet functions into the Kolmogorov-Arnold network structure, enabling the network to capture both high-frequency and low-frequency components of the input data efficiently. Wavelet-based approximations employ orthogonal or semi-orthogonal basis and also maintains a balance between accurately representing the underlying data structure and avoiding overfitting to the noise. Analogous to how water conforms to the shape of its container, Wav-KAN adapts to the data structure, resulting in enhanced accuracy, faster training speeds, and increased robustness compared to Spl-KAN and MLPs. Our results highlight the potential of Wav-KAN as a powerful tool for developing interpretable and high-performance neural networks, with applications spanning various fields. This work sets the stage for further exploration and implementation of Wav-KAN in frameworks such as PyTorch, TensorFlow, and also it makes wavelet in KAN in wide-spread usage like nowadays activation functions like ReLU, sigmoid in universal approximation theory (UAT).","PeriodicalId":507782,"journal":{"name":"SSRN Electronic Journal","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Wav-KAN: Wavelet Kolmogorov-Arnold Networks\",\"authors\":\"Zavareh Bozorgasl, Hao Chen\",\"doi\":\"10.2139/ssrn.4835325\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper , we introduce Wav-KAN, an innovative neural network architecture that leverages the Wavelet Kolmogorov-Arnold Networks (Wav-KAN) framework to enhance interpretability and performance. Traditional multilayer perceptrons (MLPs) and even recent advancements like Spl-KAN face challenges related to interpretability, training speed, robustness, computational efficiency, and performance. Wav-KAN addresses these limitations by incorporating wavelet functions into the Kolmogorov-Arnold network structure, enabling the network to capture both high-frequency and low-frequency components of the input data efficiently. Wavelet-based approximations employ orthogonal or semi-orthogonal basis and also maintains a balance between accurately representing the underlying data structure and avoiding overfitting to the noise. Analogous to how water conforms to the shape of its container, Wav-KAN adapts to the data structure, resulting in enhanced accuracy, faster training speeds, and increased robustness compared to Spl-KAN and MLPs. Our results highlight the potential of Wav-KAN as a powerful tool for developing interpretable and high-performance neural networks, with applications spanning various fields. 
This work sets the stage for further exploration and implementation of Wav-KAN in frameworks such as PyTorch, TensorFlow, and also it makes wavelet in KAN in wide-spread usage like nowadays activation functions like ReLU, sigmoid in universal approximation theory (UAT).\",\"PeriodicalId\":507782,\"journal\":{\"name\":\"SSRN Electronic Journal\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-05-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"SSRN Electronic Journal\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2139/ssrn.4835325\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"SSRN Electronic Journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2139/ssrn.4835325","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In this paper, we introduce Wav-KAN, an innovative neural network architecture that leverages the Wavelet Kolmogorov-Arnold Networks (Wav-KAN) framework to enhance interpretability and performance. Traditional multilayer perceptrons (MLPs), and even recent advances such as Spl-KAN, face challenges related to interpretability, training speed, robustness, computational efficiency, and performance. Wav-KAN addresses these limitations by incorporating wavelet functions into the Kolmogorov-Arnold network structure, enabling the network to efficiently capture both high-frequency and low-frequency components of the input data. Wavelet-based approximations employ orthogonal or semi-orthogonal bases and maintain a balance between accurately representing the underlying data structure and avoiding overfitting to noise. Analogous to how water conforms to the shape of its container, Wav-KAN adapts to the data structure, resulting in enhanced accuracy, faster training, and increased robustness compared to Spl-KAN and MLPs. Our results highlight the potential of Wav-KAN as a powerful tool for developing interpretable and high-performance neural networks, with applications spanning various fields. This work sets the stage for further exploration and implementation of Wav-KAN in frameworks such as PyTorch and TensorFlow, and for making wavelets in KANs as widely used as today's activation functions, such as ReLU and sigmoid, in universal approximation theory (UAT).
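To make the core idea concrete, below is a minimal, hypothetical PyTorch sketch of a single Wav-KAN-style layer; it is not the authors' implementation. It follows the abstract's description of placing learnable wavelet functions on the edges of a Kolmogorov-Arnold network: each input-output edge evaluates a Mexican hat (Ricker) mother wavelet with its own learnable scale, translation, and weight, and each output node sums its weighted edge responses. The class name `WavKANLayer` and the choice of mother wavelet are illustrative assumptions.

```python
import torch
import torch.nn as nn


class WavKANLayer(nn.Module):
    """Sketch of a Wav-KAN-style layer: every input-output edge applies a
    learnable Mexican hat wavelet with its own scale, translation, and weight,
    and each output node sums its weighted edge responses."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # One (scale, translation, weight) triple per edge, all learnable.
        self.scale = nn.Parameter(torch.ones(out_features, in_features))
        self.translation = nn.Parameter(torch.zeros(out_features, in_features))
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features) -> broadcast to (batch, out_features, in_features)
        z = (x.unsqueeze(1) - self.translation) / self.scale
        # Mexican hat (second derivative of a Gaussian) as the mother wavelet;
        # any normalization constant is absorbed by the learnable weights.
        psi = (1.0 - z**2) * torch.exp(-0.5 * z**2)
        # Weighted sum over the input dimension yields each output node.
        return (self.weight * psi).sum(dim=-1)


if __name__ == "__main__":
    layer = WavKANLayer(in_features=4, out_features=3)
    out = layer(torch.randn(8, 4))
    print(out.shape)  # torch.Size([8, 3])
```

Stacking such layers gives a KAN in which the learnable edge functions are wavelets rather than splines; other mother wavelets (e.g., Morlet or derivative-of-Gaussian) could be substituted in the same structure.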