{"title":"Exploring Weight Distributions and Dependence in Neural Networks With $\\alpha$-Stable Distributions","authors":"Jipeng Li;Xueqiong Yuan;Ercan Engin Kuruoglu","doi":"10.1109/TAI.2024.3409673","DOIUrl":null,"url":null,"abstract":"The fundamental use of neural networks is in providing a nonlinear mapping between input and output data with possibly a high number of parameters that can be learned from data directly. Consequently, studying the model's parameters, particularly the weights, is of paramount importance. The distribution and interdependencies of these weights have a direct impact on the model's generalizability, compressibility, initialization, and convergence speed. By fitting the weights of pretrained neural networks using the \n<inline-formula><tex-math>$\\alpha$</tex-math></inline-formula>\n-stable distributions and conducting statistical tests, we discover widespread heavy-tailed phenomena in neural network weights, with a few layers exhibiting asymmetry. Additionally, we employ a multivariate \n<inline-formula><tex-math>$\\alpha$</tex-math></inline-formula>\n-stable distribution to model the weights and explore the relationship between weights within and across layers by calculating the signed symmetric covariation coefficient. The results reveal a strong dependence among certain weights. Our findings indicate that the Gaussian assumption, symmetry assumption, and independence assumption commonly used in neural network research might be inconsistent with reality. In conclusion, our research shows three properties observed in neural network weights: heavy-tailed phenomena, asymmetry, and dependence on certain weights.","PeriodicalId":73305,"journal":{"name":"IEEE transactions on artificial intelligence","volume":"5 11","pages":"5519-5529"},"PeriodicalIF":0.0000,"publicationDate":"2024-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on artificial intelligence","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10549953/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
The fundamental use of neural networks is to provide a nonlinear mapping between input and output data, with a possibly large number of parameters that can be learned directly from data. Consequently, studying the model's parameters, particularly the weights, is of paramount importance. The distribution and interdependencies of these weights have a direct impact on the model's generalizability, compressibility, initialization, and convergence speed. By fitting the weights of pretrained neural networks with $\alpha$-stable distributions and conducting statistical tests, we discover widespread heavy-tailed phenomena in neural network weights, with a few layers exhibiting asymmetry. Additionally, we employ a multivariate $\alpha$-stable distribution to model the weights and explore the relationship between weights within and across layers by calculating the signed symmetric covariation coefficient. The results reveal a strong dependence among certain weights. Our findings indicate that the Gaussian assumption, symmetry assumption, and independence assumption commonly used in neural network research may be inconsistent with reality. In conclusion, our research shows three properties observed in neural network weights: heavy-tailed phenomena, asymmetry, and dependence among certain weights.
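As a rough illustration of the univariate fitting step described in the abstract, the sketch below fits an $\alpha$-stable distribution to the weights of one layer of a pretrained network and inspects the tail index and skewness. This is a minimal sketch under stated assumptions, not the authors' code: the choice of torchvision's resnet18, the particular layer, the subsample size, and the use of scipy's levy_stable maximum-likelihood fit are all illustrative; the paper's actual estimation procedure and statistical tests may differ.

```python
# Minimal sketch (not the paper's code): fit a univariate alpha-stable law to the
# weights of one layer of a pretrained network. Assumes torchvision and scipy are
# installed; the model, layer, and subsample size are illustrative choices.
import numpy as np
from scipy.stats import levy_stable
import torchvision.models as models

model = models.resnet18(weights="IMAGENET1K_V1")          # any pretrained model would do
w = model.layer1[0].conv1.weight.detach().numpy().ravel() # flatten one layer's weights

# levy_stable.fit is numerical maximum-likelihood estimation and can be slow,
# so fit on a random subsample of the weights.
rng = np.random.default_rng(0)
sample = rng.choice(w, size=min(1000, w.size), replace=False)

alpha, beta, loc, scale = levy_stable.fit(sample)
print(f"alpha (tail index) = {alpha:.3f}  # alpha < 2 suggests heavier-than-Gaussian tails")
print(f"beta  (skewness)   = {beta:.3f}  # beta != 0 suggests asymmetry")
```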
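For the dependence analysis, the abstract refers to the signed symmetric covariation coefficient of jointly symmetric $\alpha$-stable variables. The fragment below is a hedged sketch of one fractional-lower-order-moment (FLOM) style estimator of the covariation coefficient from the $\alpha$-stable literature, combined into a symmetrized, sign-preserving coefficient; the moment order p, the symmetrization, and the synthetic data are assumptions and may not match the authors' implementation or their pairing of weights.

```python
# Hedged sketch of a FLOM-style estimate of the covariation coefficient
# lambda(X, Y) = E[X * sign(Y)|Y|^(p-1)] / E[|Y|^p] for some 1 <= p < alpha,
# and a symmetrized, sign-preserving combination of lambda(X, Y) and lambda(Y, X).
# The moment order p and the symmetrization are illustrative assumptions.
import numpy as np

def covariation_coeff(x, y, p=1.2):
    """FLOM estimate of the covariation coefficient of x on y."""
    num = np.mean(x * np.sign(y) * np.abs(y) ** (p - 1))
    den = np.mean(np.abs(y) ** p)
    return num / den

def signed_symmetric_covariation(x, y, p=1.2):
    """Sign-preserving geometric mean of the two covariation coefficients."""
    lxy = covariation_coeff(x, y, p)
    lyx = covariation_coeff(y, x, p)
    return np.sign(lxy * lyx) * np.sqrt(np.abs(lxy * lyx))

# Example on synthetic heavy-tailed data (placeholder for two weight vectors):
rng = np.random.default_rng(0)
z = rng.standard_t(df=3, size=10_000)                 # heavy-tailed common factor
x = z + 0.5 * rng.standard_t(df=3, size=10_000)
y = z + 0.5 * rng.standard_t(df=3, size=10_000)
print(signed_symmetric_covariation(x, y))             # positive value indicates positive dependence
```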