The Local Landscape of Phase Retrieval Under Limited Samples

Kaizhao Liu, Zihao Wang, Lei Wu
{"title":"The Local Landscape of Phase Retrieval Under Limited Samples","authors":"Kaizhao Liu, Zihao Wang, Lei Wu","doi":"arxiv-2311.15221","DOIUrl":null,"url":null,"abstract":"In this paper, we provide a fine-grained analysis of the local landscape of\nphase retrieval under the regime with limited samples. Our aim is to ascertain\nthe minimal sample size necessary to guarantee a benign local landscape\nsurrounding global minima in high dimensions. Let $n$ and $d$ denote the sample\nsize and input dimension, respectively. We first explore the local convexity\nand establish that when $n=o(d\\log d)$, for almost every fixed point in the\nlocal ball, the Hessian matrix must have negative eigenvalues as long as $d$ is\nsufficiently large. Consequently, the local landscape is highly non-convex. We\nnext consider the one-point strong convexity and show that as long as\n$n=\\omega(d)$, with high probability, the landscape is one-point strongly\nconvex in the local annulus: $\\{w\\in\\mathbb{R}^d: o_d(1)\\leqslant\n\\|w-w^*\\|\\leqslant c\\}$, where $w^*$ is the ground truth and $c$ is an absolute\nconstant. This implies that gradient descent initialized from any point in this\ndomain can converge to an $o_d(1)$-loss solution exponentially fast.\nFurthermore, we show that when $n=o(d\\log d)$, there is a radius of\n$\\widetilde\\Theta\\left(\\sqrt{1/d}\\right)$ such that one-point convexity breaks\nin the corresponding smaller local ball. This indicates an impossibility to\nestablish a convergence to exact $w^*$ for gradient descent under limited\nsamples by relying solely on one-point convexity.","PeriodicalId":501330,"journal":{"name":"arXiv - MATH - Statistics Theory","volume":"55 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-11-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Statistics Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2311.15221","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In this paper, we provide a fine-grained analysis of the local landscape of phase retrieval in the limited-sample regime. Our aim is to ascertain the minimal sample size necessary to guarantee a benign local landscape surrounding global minima in high dimensions. Let $n$ and $d$ denote the sample size and input dimension, respectively. We first examine local convexity and establish that when $n=o(d\log d)$, for almost every fixed point in the local ball, the Hessian matrix has negative eigenvalues once $d$ is sufficiently large. Consequently, the local landscape is highly non-convex. We next consider one-point strong convexity and show that as long as $n=\omega(d)$, with high probability, the landscape is one-point strongly convex in the local annulus $\{w\in\mathbb{R}^d: o_d(1)\leqslant \|w-w^*\|\leqslant c\}$, where $w^*$ is the ground truth and $c$ is an absolute constant. This implies that gradient descent initialized from any point in this domain converges to an $o_d(1)$-loss solution exponentially fast. Furthermore, we show that when $n=o(d\log d)$, there is a radius of order $\widetilde\Theta\left(\sqrt{1/d}\right)$ within which one-point convexity breaks down in the corresponding smaller local ball. This shows that, under limited samples, convergence of gradient descent to the exact $w^*$ cannot be established by relying on one-point convexity alone.
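The abstract leaves the objective implicit. For concreteness, a standard squared-loss formulation of phase retrieval (an assumption here, not stated in the abstract) can be written as

$$ L(w) \;=\; \frac{1}{4n}\sum_{i=1}^{n}\bigl(\langle a_i, w\rangle^2 - y_i\bigr)^2, \qquad y_i = \langle a_i, w^*\rangle^2, $$

where the $a_i$ are i.i.d. Gaussian sensing vectors and the $y_i$ are noiseless measurements. One-point strong convexity of $L$ toward $w^*$ on a domain $D$, with modulus $\mu > 0$, is the condition

$$ \langle \nabla L(w),\, w - w^*\rangle \;\geqslant\; \mu\,\|w - w^*\|^2 \qquad \text{for all } w \in D. $$

Combined with smoothness of $L$, this condition makes each gradient step contract the distance to $w^*$ by a constant factor, which is the mechanism behind the exponentially fast convergence claimed in the annulus; when the condition fails inside the $\widetilde\Theta\left(\sqrt{1/d}\right)$ ball, that contraction argument no longer applies.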