On Random Matrices Arising in Deep Neural Networks: General I.I.D. Case
L. Pastur, V. Slavin
Random Matrices: Theory and Applications, published 2020-11-20
DOI: 10.1142/s2010326322500460
Abstract:
We study the distribution of singular values of products of random matrices pertinent to the analysis of deep neural networks. The matrices resemble products of sample covariance matrices; an important difference, however, is that the population covariance matrices, which in statistics and random matrix theory are assumed to be non-random, or random but independent of the data matrix, are now certain functions of the random data matrices (synaptic weight matrices in deep neural network terminology). The problem has been treated in recent work [25, 13] using the techniques of free probability theory. Since, however, free probability theory deals with population covariance matrices that are independent of the data matrices, its applicability has to be justified. The justification was given in [22] for Gaussian data matrices with independent entries, a standard analytical model of free probability, using a version of the techniques of random matrix theory. In this paper we use another, more streamlined, version of the techniques of random matrix theory to generalize the results of [22] to the case where the entries of the synaptic weight matrices are merely independent identically distributed random variables with zero mean and finite fourth moment. This, in particular, extends the property of so-called macroscopic universality to the considered random matrices.
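The setting described in the abstract can be illustrated numerically: one forms the product of several random weight matrices with i.i.d. entries and examines its singular values. A minimal sketch, assuming square matrices and the standard 1/sqrt(n) scaling (function names and parameters are illustrative, not from the paper), compares Gaussian entries with Rademacher (±1) entries — both have zero mean, unit variance, and finite fourth moment, so by the macroscopic universality discussed above their empirical singular value distributions should be close for large n:

```python
import numpy as np

def empirical_singular_values(n, L, sampler, rng, trials=10):
    """Collect singular values of the product W_L ... W_1 of L random
    n x n matrices with i.i.d. entries of mean 0 and variance 1,
    each scaled by 1/sqrt(n) so the singular values stay O(1)."""
    svals = []
    for _ in range(trials):
        M = np.eye(n)
        for _ in range(L):
            W = sampler(rng, (n, n)) / np.sqrt(n)
            M = W @ M
        svals.append(np.linalg.svd(M, compute_uv=False))
    return np.concatenate(svals)

rng = np.random.default_rng(0)
# Gaussian entries: the analytical model treated in [22].
gauss = empirical_singular_values(
    200, 3, lambda r, shape: r.standard_normal(shape), rng)
# Rademacher entries: i.i.d., zero mean, finite fourth moment,
# covered by the generalization in this paper.
rademacher = empirical_singular_values(
    200, 3, lambda r, shape: r.choice([-1.0, 1.0], size=shape), rng)
```

Histograms of `gauss` and `rademacher` should nearly coincide, which is the content of macroscopic universality in this empirical sketch.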
About the journal:
Random Matrix Theory (RMT) has a long and rich history and, especially in recent years, has been shown to have important applications in many diverse areas of mathematics, science, and engineering. The scope of RMT and its applications includes classical analysis, probability theory, and the statistical analysis of big data, as well as connections to graph theory, number theory, representation theory, and many areas of mathematical physics.
Applications of Random Matrix Theory continue to emerge, and new applications are welcome in this journal. Examples include orthogonal polynomial theory, free probability, integrable systems, growth models, wireless communications, signal processing, numerical computing, complex networks, economics, statistical mechanics, and quantum theory.
Special issues devoted to a single topic of current interest will also be considered and published in this journal.