Lifting the curse of dimensionality: a random matrix-theoretic approach

Author: T. Marzetta
DOI: 10.1109/WIOPT.2009.5291553 (https://doi.org/10.1109/WIOPT.2009.5291553)
Venue: International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks (WiOpt)
Published: 2009-06-23
Pages: 1-2
Citations: 0
Abstract
The ubiquity of inexpensive sensors implies that we can measure vector-valued data of ever-increasing dimension. But the number of independent measurements of the data vector is limited, so the sample covariance matrix is usually singular. The traditional remedy for singularity is diagonal loading: the addition of a small identity matrix to make the covariance estimate invertible. An alternative to diagonal loading is to reduce the dimension of the data vectors below the number of independent observations through an ensemble of isotropically random (Haar-measure) unitary matrices. For every member of the unitary ensemble, the shortened data vectors yield a statistically meaningful, invertible covariance estimate from which we can compute an estimate of the ultimate desired quantity. The final step is to take the expectation of this estimate with respect to the unitary ensemble. For a class of applications that includes adaptive spectral estimation, the design of a linear estimator, and supervised learning, the random-matrix approach yields an estimate of the inverse covariance matrix that preserves the eigenvectors of the sample covariance matrix but alters the eigenvalues in a nontrivial manner. A closed-form expression for the expectation over the unitary ensemble eludes us, but we have obtained a tractable asymptotic expression. Preliminary numerical results indicate considerable promise for this approach.
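The procedure the abstract describes can be approximated by Monte Carlo: draw a Haar-distributed unitary matrix, keep its first k rows to shorten the data vectors, invert the resulting k × k sample covariance, lift back to the full dimension, and average over draws. The sketch below is an illustrative reading of that recipe, not the paper's actual estimator; the function names, the choice of k, the number of draws, and the absence of any normalization constant are all assumptions introduced here.

```python
import numpy as np

def haar_unitary(n, rng):
    """Draw an n x n unitary matrix from the Haar measure.

    Standard construction: QR-factorize a complex Gaussian matrix,
    then fix the phases of R's diagonal so the result is exactly Haar.
    """
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2.0)
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))  # rescale each column by a unit phase

def inverse_cov_estimate(X, k, num_draws=200, seed=0):
    """Monte Carlo sketch of the random-unitary inverse-covariance estimate.

    X : (M, N) array -- N independent snapshots of an M-dimensional vector,
        with M > N, so the full M x M sample covariance is singular.
    k : reduced dimension, k < N, so each reduced covariance is invertible.
    Averages P^H (P S P^H)^{-1} P over random k-row projections P taken
    from Haar unitaries (normalization constants are omitted here).
    """
    rng = np.random.default_rng(seed)
    M, N = X.shape
    acc = np.zeros((M, M), dtype=complex)
    for _ in range(num_draws):
        U = haar_unitary(M, rng)
        P = U[:k, :]                      # first k rows: a random k-dim subspace
        Y = P @ X                         # shortened data vectors, shape (k, N)
        Rk = (Y @ Y.conj().T) / N         # invertible k x k sample covariance
        acc += P.conj().T @ np.linalg.inv(Rk) @ P  # lift back to M x M
    return acc / num_draws
```

Each term of the average is Hermitian and positive semidefinite of rank k, so after enough draws the averaged estimate is full-rank and invertible even though the M × M sample covariance itself is singular, which is the point of the construction.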