{"title":"不对称有助于:不对称扰动低阶矩阵的特征值和特征向量分析。","authors":"Yuxin Chen, Chen Cheng, Jianqing Fan","doi":"10.1214/20-aos1963","DOIUrl":null,"url":null,"abstract":"<p><p>This paper is concerned with the interplay between statistical asymmetry and spectral methods. Suppose we are interested in estimating a rank-1 and symmetric matrix <math> <mrow> <msup><mstyle><mi>M</mi></mstyle> <mo>⋆</mo></msup> <mo>∈</mo> <msup><mi>ℝ</mi> <mrow><mi>n</mi> <mo>×</mo> <mi>n</mi></mrow> </msup> </mrow> </math> , yet only a randomly perturbed version <b><i>M</i></b> is observed. The noise matrix <b><i>M</i></b> - <b><i>M</i></b> <sup>⋆</sup> is composed of independent (but not necessarily homoscedastic) entries and is, therefore, not symmetric in general. This might arise if, for example, we have two independent samples for each entry of <b><i>M</i></b> <sup>⋆</sup> and arrange them in an <i>asymmetric</i> fashion. The aim is to estimate the leading eigenvalue and the leading eigenvector of <b><i>M</i></b> <sup>⋆</sup>. We demonstrate that the leading eigenvalue of the data matrix <b><i>M</i></b> can be <math><mrow><mi>O</mi> <mo>(</mo> <msqrt><mi>n</mi></msqrt> <mo>)</mo></mrow> </math> times more accurate (up to some log factor) than its (unadjusted) leading singular value of <b><i>M</i></b> in eigenvalue estimation. Moreover, the eigen-decomposition approach is fully adaptive to heteroscedasticity of noise, without the need of any prior knowledge about the noise distributions. In a nutshell, this curious phenomenon arises since the statistical asymmetry automatically mitigates the bias of the eigenvalue approach, thus eliminating the need of careful bias correction. Additionally, we develop appealing non-asymptotic eigenvector perturbation bounds; in particular, we are able to bound the perterbation of any linear function of the leading eigenvector of <b><i>M</i></b> (e.g. entrywise eigenvector perturbation). We also provide partial theory for the more general rank-<i>r</i> case. The takeaway message is this: arranging the data samples in an asymmetric manner and performing eigen-decomposition could sometimes be quite beneficial.</p>","PeriodicalId":8032,"journal":{"name":"Annals of Statistics","volume":"49 1","pages":"435-458"},"PeriodicalIF":3.2000,"publicationDate":"2021-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8300484/pdf/nihms-1639565.pdf","citationCount":"0","resultStr":"{\"title\":\"ASYMMETRY HELPS: EIGENVALUE AND EIGENVECTOR ANALYSES OF ASYMMETRICALLY PERTURBED LOW-RANK MATRICES.\",\"authors\":\"Yuxin Chen, Chen Cheng, Jianqing Fan\",\"doi\":\"10.1214/20-aos1963\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>This paper is concerned with the interplay between statistical asymmetry and spectral methods. Suppose we are interested in estimating a rank-1 and symmetric matrix <math> <mrow> <msup><mstyle><mi>M</mi></mstyle> <mo>⋆</mo></msup> <mo>∈</mo> <msup><mi>ℝ</mi> <mrow><mi>n</mi> <mo>×</mo> <mi>n</mi></mrow> </msup> </mrow> </math> , yet only a randomly perturbed version <b><i>M</i></b> is observed. The noise matrix <b><i>M</i></b> - <b><i>M</i></b> <sup>⋆</sup> is composed of independent (but not necessarily homoscedastic) entries and is, therefore, not symmetric in general. This might arise if, for example, we have two independent samples for each entry of <b><i>M</i></b> <sup>⋆</sup> and arrange them in an <i>asymmetric</i> fashion. 
The aim is to estimate the leading eigenvalue and the leading eigenvector of <b><i>M</i></b> <sup>⋆</sup>. We demonstrate that the leading eigenvalue of the data matrix <b><i>M</i></b> can be <math><mrow><mi>O</mi> <mo>(</mo> <msqrt><mi>n</mi></msqrt> <mo>)</mo></mrow> </math> times more accurate (up to some log factor) than its (unadjusted) leading singular value of <b><i>M</i></b> in eigenvalue estimation. Moreover, the eigen-decomposition approach is fully adaptive to heteroscedasticity of noise, without the need of any prior knowledge about the noise distributions. In a nutshell, this curious phenomenon arises since the statistical asymmetry automatically mitigates the bias of the eigenvalue approach, thus eliminating the need of careful bias correction. Additionally, we develop appealing non-asymptotic eigenvector perturbation bounds; in particular, we are able to bound the perterbation of any linear function of the leading eigenvector of <b><i>M</i></b> (e.g. entrywise eigenvector perturbation). We also provide partial theory for the more general rank-<i>r</i> case. The takeaway message is this: arranging the data samples in an asymmetric manner and performing eigen-decomposition could sometimes be quite beneficial.</p>\",\"PeriodicalId\":8032,\"journal\":{\"name\":\"Annals of Statistics\",\"volume\":\"49 1\",\"pages\":\"435-458\"},\"PeriodicalIF\":3.2000,\"publicationDate\":\"2021-02-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8300484/pdf/nihms-1639565.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Annals of Statistics\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1214/20-aos1963\",\"RegionNum\":1,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2021/1/29 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q1\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Annals of Statistics","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1214/20-aos1963","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2021/1/29 0:00:00","PubModel":"Epub","JCR":"Q1","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Citations: 0
Abstract
ASYMMETRY HELPS: EIGENVALUE AND EIGENVECTOR ANALYSES OF ASYMMETRICALLY PERTURBED LOW-RANK MATRICES.
This paper is concerned with the interplay between statistical asymmetry and spectral methods. Suppose we are interested in estimating a rank-1 and symmetric matrix M⋆ ∈ ℝ^(n×n), yet only a randomly perturbed version M is observed. The noise matrix M - M⋆ is composed of independent (but not necessarily homoscedastic) entries and is, therefore, not symmetric in general. This might arise if, for example, we have two independent samples for each entry of M⋆ and arrange them in an asymmetric fashion. The aim is to estimate the leading eigenvalue and the leading eigenvector of M⋆. We demonstrate that, in eigenvalue estimation, the leading eigenvalue of the data matrix M can be O(√n) times more accurate (up to some log factor) than the (unadjusted) leading singular value of M. Moreover, the eigen-decomposition approach is fully adaptive to heteroscedasticity of noise, without the need for any prior knowledge about the noise distributions. In a nutshell, this curious phenomenon arises because the statistical asymmetry automatically mitigates the bias of the eigenvalue approach, thus eliminating the need for careful bias correction. Additionally, we develop appealing non-asymptotic eigenvector perturbation bounds; in particular, we are able to bound the perturbation of any linear function of the leading eigenvector of M (e.g., entrywise eigenvector perturbation). We also provide partial theory for the more general rank-r case. The takeaway message is this: arranging the data samples in an asymmetric manner and performing eigen-decomposition could sometimes be quite beneficial.
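To make the asymmetric-arrangement idea concrete, below is a minimal NumPy simulation sketch. It is not taken from the paper; the dimension n, the noise level sigma, and the signal strength lam_star are illustrative assumptions. The sketch builds a rank-1 symmetric M⋆, fills the upper and lower triangles of the observed matrix with two independent noisy samples of each entry, and compares the leading eigenvalue of the resulting asymmetric matrix with its leading singular value as estimators of the true leading eigenvalue.

# Illustrative simulation (assumed scales; not the authors' code).
import numpy as np

rng = np.random.default_rng(0)
n = 1000
u = rng.standard_normal(n)
u /= np.linalg.norm(u)                 # unit-norm leading eigenvector
lam_star = 20.0                        # true leading eigenvalue (assumed scale)
M_star = lam_star * np.outer(u, u)     # rank-1 symmetric ground truth

sigma = 0.1                            # noise standard deviation (assumed)
A = M_star + sigma * rng.standard_normal((n, n))   # first noisy sample of each entry
B = M_star + sigma * rng.standard_normal((n, n))   # second, independent sample

# Asymmetric arrangement: sample A on and above the diagonal, sample B strictly
# below it, so the noise matrix M - M_star has independent (non-symmetric) entries.
M = np.triu(A) + np.tril(B, k=-1)

# Leading eigenvalue of the asymmetric M (largest real part) versus the leading
# singular value of M, both used as estimates of lam_star.
eigvals = np.linalg.eigvals(M)
lam_hat = eigvals[np.argmax(eigvals.real)].real
sigma_hat = np.linalg.svd(M, compute_uv=False)[0]

print(f"true leading eigenvalue   : {lam_star:.4f}")
print(f"eigenvalue estimate error : {abs(lam_hat - lam_star):.4f}")
print(f"singular value est. error : {abs(sigma_hat - lam_star):.4f}")

Under such a setup one would expect the eigenvalue error to come out substantially smaller than the singular-value error, consistent with the roughly √n improvement described in the abstract; the exact figures depend on the assumed scales of n, sigma, and lam_star.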
Journal Introduction:
The Annals of Statistics aims to publish research papers of the highest quality reflecting the many facets of contemporary statistics. Primary emphasis is placed on importance and originality, not on formalism. The journal aims to cover all areas of statistics, especially mathematical statistics and applied and interdisciplinary statistics. Of course, many of the best papers will touch on more than one of these general areas, because the discipline of statistics has deep roots both in mathematics and in substantive scientific fields.