{"title":"A unified Fourier slice method to derive ridgelet transform for a variety of depth-2 neural networks","authors":"Sho Sonoda , Isao Ishikawa , Masahiro Ikeda","doi":"10.1016/j.jspi.2024.106184","DOIUrl":null,"url":null,"abstract":"<div><p>To investigate neural network parameters, it is easier to study the distribution of parameters than to study the parameters in each neuron. The ridgelet transform is a pseudo-inverse operator that maps a given function <span><math><mi>f</mi></math></span> to the parameter distribution <span><math><mi>γ</mi></math></span> so that a network <span><math><mrow><mstyle><mi>N</mi><mi>N</mi></mstyle><mrow><mo>[</mo><mi>γ</mi><mo>]</mo></mrow></mrow></math></span> reproduces <span><math><mi>f</mi></math></span>, i.e. <span><math><mrow><mstyle><mi>N</mi><mi>N</mi></mstyle><mrow><mo>[</mo><mi>γ</mi><mo>]</mo></mrow><mo>=</mo><mi>f</mi></mrow></math></span>. For depth-2 fully-connected networks on a Euclidean space, the ridgelet transform has been discovered up to the closed-form expression, thus we could describe how the parameters are distributed. However, for a variety of modern neural network architectures, the closed-form expression has not been known. In this paper, we explain a systematic method using Fourier expressions to derive ridgelet transforms for a variety of modern networks such as networks on finite fields <span><math><msub><mrow><mi>F</mi></mrow><mrow><mi>p</mi></mrow></msub></math></span>, group convolutional networks on abstract Hilbert space <span><math><mi>H</mi></math></span>, fully-connected networks on noncompact symmetric spaces <span><math><mrow><mi>G</mi><mo>/</mo><mi>K</mi></mrow></math></span>, and pooling layers, or the <span><math><mi>d</mi></math></span>-plane ridgelet transform.</p></div>","PeriodicalId":50039,"journal":{"name":"Journal of Statistical Planning and Inference","volume":"233 ","pages":"Article 106184"},"PeriodicalIF":0.8000,"publicationDate":"2024-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S0378375824000417/pdfft?md5=98e3c89ff86925f67f13c56d174f0109&pid=1-s2.0-S0378375824000417-main.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Statistical Planning and Inference","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0378375824000417","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Citations: 0
Abstract
To investigate neural network parameters, it is easier to study the distribution of parameters than to study the parameters of each neuron individually. The ridgelet transform is a pseudo-inverse operator that maps a given function $f$ to the parameter distribution $\gamma$ so that a network $\mathrm{NN}[\gamma]$ reproduces $f$, i.e. $\mathrm{NN}[\gamma]=f$. For depth-2 fully-connected networks on a Euclidean space, the ridgelet transform is known in closed form, so we can describe how the parameters are distributed. However, for a variety of modern neural network architectures, no closed-form expression has been known. In this paper, we explain a systematic method using Fourier expressions to derive ridgelet transforms for a variety of modern networks, such as networks on finite fields $\mathbb{F}_p$, group convolutional networks on an abstract Hilbert space $\mathcal{H}$, fully-connected networks on noncompact symmetric spaces $G/K$, and pooling layers, i.e. the $d$-plane ridgelet transform.
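For readers unfamiliar with the setup, here is a minimal sketch of the classical Euclidean case that the abstract refers to; the activation $\sigma$ and the ridgelet function $\psi$ are standard notation in this literature but are not defined on this page:

\[
\mathrm{NN}[\gamma](x) \;=\; \int_{\mathbb{R}^{m}\times\mathbb{R}} \gamma(a,b)\,\sigma(a\cdot x - b)\,\mathrm{d}a\,\mathrm{d}b
\quad \text{(depth-2 network with parameter distribution } \gamma\text{)},
\]
\[
\mathrm{R}[f;\psi](a,b) \;=\; \int_{\mathbb{R}^{m}} f(x)\,\overline{\psi(a\cdot x - b)}\,\mathrm{d}x
\quad \text{(ridgelet transform of } f\text{)}.
\]

Under an admissibility condition pairing $\sigma$ and $\psi$, one obtains the reconstruction $\mathrm{NN}[\mathrm{R}[f;\psi]] = f$ (up to a constant), which is the sense in which the ridgelet transform acts as a pseudo-inverse; the paper derives analogous closed-form transforms beyond this Euclidean setting.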
About the Journal
The Journal of Statistical Planning and Inference offers itself as a multifaceted and all-inclusive bridge between classical aspects of statistics and probability and the emerging interdisciplinary aspects that have the potential to revolutionize the subject. While we maintain our traditional strength in statistical inference, design, classical probability, and large-sample methods, we also have a far more inclusive and broadened scope to keep up with the new problems that confront us as statisticians, mathematicians, and scientists.
We publish high-quality articles in all branches of statistics, probability, discrete mathematics, machine learning, and bioinformatics. We especially welcome well-written and up-to-date review articles on fundamental themes of statistics, probability, machine learning, and general biostatistics. Thoughtful letters to the editors, interesting problems in need of a solution, and short notes carrying an element of elegance or beauty are equally welcome.