The Local Landscape of Phase Retrieval Under Limited Samples

IF 2.2 | CAS Tier 3 (Computer Science) | JCR Q3, COMPUTER SCIENCE, INFORMATION SYSTEMS | IEEE Transactions on Information Theory | Pub Date: 2024-10-15 | DOI: 10.1109/TIT.2024.3481269
Kaizhao Liu;Zihao Wang;Lei Wu
{"title":"The Local Landscape of Phase Retrieval Under Limited Samples","authors":"Kaizhao Liu;Zihao Wang;Lei Wu","doi":"10.1109/TIT.2024.3481269","DOIUrl":null,"url":null,"abstract":"We present a fine-grained analysis of the local landscape of phase retrieval under the regime of limited samples. Specifically, we aim to ascertain the minimal sample size required to guarantee a benign local landscape surrounding global minima in high dimensions. Let n and d denote the sample size and input dimension, respectively. We first explore the local convexity and establish that when \n<inline-formula> <tex-math>$n=o(d\\log d)$ </tex-math></inline-formula>\n, for almost every fixed point in the local ball, the Hessian matrix has negative eigenvalues, provided d is sufficiently large. We next consider the one-point convexity and show that, as long as \n<inline-formula> <tex-math>$n=\\omega (d)$ </tex-math></inline-formula>\n, with high probability, the landscape is one-point strongly convex in the local annulus: \n<inline-formula> <tex-math>$\\{w\\in \\mathbb {R}^{d}: o_{d}({1})\\leqslant \\|w-w^{*}\\|\\leqslant c\\}$ </tex-math></inline-formula>\n, where \n<inline-formula> <tex-math>$w^{*}$ </tex-math></inline-formula>\n is the ground truth and c is an absolute constant. This implies that gradient descent, initialized from any point in this domain, can converge to an \n<inline-formula> <tex-math>$o_{d}({1})$ </tex-math></inline-formula>\n-loss solution exponentially fast. Furthermore, we show that when \n<inline-formula> <tex-math>$n=o(d\\log d)$ </tex-math></inline-formula>\n, there is a radius of \n<inline-formula> <tex-math>$\\widetilde {\\Theta } \\left ({{\\sqrt {1/d}}}\\right)$ </tex-math></inline-formula>\n such that one-point convexity breaks down in the corresponding smaller local ball. This indicates an impossibility to establish a convergence to the exact \n<inline-formula> <tex-math>$w^{*}$ </tex-math></inline-formula>\n for gradient descent under limited samples by relying solely on one-point convexity.","PeriodicalId":13494,"journal":{"name":"IEEE Transactions on Information Theory","volume":"70 12","pages":"9012-9035"},"PeriodicalIF":2.2000,"publicationDate":"2024-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Information Theory","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10718309/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

We present a fine-grained analysis of the local landscape of phase retrieval in the limited-sample regime. Specifically, we aim to ascertain the minimal sample size required to guarantee a benign local landscape surrounding the global minima in high dimensions. Let $n$ and $d$ denote the sample size and input dimension, respectively. We first examine local convexity and establish that when $n=o(d\log d)$, for almost every fixed point in the local ball, the Hessian matrix has negative eigenvalues, provided $d$ is sufficiently large. We next consider one-point convexity and show that, as long as $n=\omega(d)$, with high probability the landscape is one-point strongly convex in the local annulus $\{w\in \mathbb{R}^{d}: o_{d}(1)\leqslant \|w-w^{*}\|\leqslant c\}$, where $w^{*}$ is the ground truth and $c$ is an absolute constant. This implies that gradient descent initialized at any point in this annulus converges to an $o_{d}(1)$-loss solution exponentially fast. Furthermore, we show that when $n=o(d\log d)$, there is a radius of order $\widetilde{\Theta}(\sqrt{1/d})$ within which one-point convexity breaks down. Hence, under limited samples, convergence of gradient descent to the exact $w^{*}$ cannot be established by relying solely on one-point convexity.
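The abstract does not restate the objective being analyzed; a standard choice for phase retrieval, which we assume below, is the quartic empirical loss $L(w)=\frac{1}{4n}\sum_{i=1}^{n}\big((x_{i}^{\top}w)^{2}-y_{i}\big)^{2}$ with $y_{i}=(x_{i}^{\top}w^{*})^{2}$ and Gaussian sensing vectors $x_{i}\sim\mathcal{N}(0,I_{d})$, so that $\nabla L(w)=\frac{1}{n}\sum_{i}\big((x_{i}^{\top}w)^{2}-y_{i}\big)(x_{i}^{\top}w)\,x_{i}$ and $\nabla^{2}L(w)=\frac{1}{n}\sum_{i}\big(3(x_{i}^{\top}w)^{2}-y_{i}\big)x_{i}x_{i}^{\top}$. The Python sketch below numerically probes the two phenomena from the abstract at a single finite $(n,d)$ of our choosing; it is an illustration under these assumptions, not the paper's construction.

import numpy as np

# A minimal sketch (our assumptions, not the paper's code): quartic phase
# retrieval loss with Gaussian sensing vectors, probed at a finite (n, d).
rng = np.random.default_rng(0)
d = 200
n = 5 * d                           # linear sample regime, n = Theta(d)
X = rng.standard_normal((n, d))     # rows are the sensing vectors x_i
w_star = np.zeros(d)
w_star[0] = 1.0                     # ground truth on the unit sphere
y = (X @ w_star) ** 2               # phaseless measurements y_i = (x_i^T w*)^2

def grad(w):
    z = X @ w
    return (X * ((z**2 - y) * z)[:, None]).mean(axis=0)

def hessian(w):
    z = X @ w
    return (X.T * (3 * z**2 - y)) @ X / n   # (1/n) sum_i (3 z_i^2 - y_i) x_i x_i^T

u = rng.standard_normal(d)
u /= np.linalg.norm(u)              # random unit direction

# 1) Local convexity: at a fixed point in the local ball around w*, the paper
#    predicts negative Hessian eigenvalues when n = o(d log d).
w0 = w_star + 0.1 * u               # ||w0 - w*|| = 0.1
print("min Hessian eigenvalue:", np.linalg.eigvalsh(hessian(w0))[0])

# 2) One-point convexity: in the annulus, <grad L(w), w - w*> should be
#    positive with high probability once n = omega(d).
w1 = w_star + 0.3 * u               # ||w1 - w*|| = 0.3
print("<grad L(w1), w1 - w*>:", float(grad(w1) @ (w1 - w_star)))

Re-running the sketch with $n$ on either side of $d\log d$ gives a rough empirical feel for the transition the theorems describe, though the statements themselves are asymptotic in $d$.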
Source Journal
IEEE Transactions on Information Theory (Engineering & Technology; Engineering: Electrical & Electronic)
CiteScore: 5.70
Self-citation rate: 20.00%
Articles per year: 514
Review time: 12 months
Journal Description: The IEEE Transactions on Information Theory is a journal that publishes theoretical and experimental papers concerned with the transmission, processing, and utilization of information. The boundaries of acceptable subject matter are intentionally not sharply delimited. Rather, it is hoped that as the focus of research activity changes, a flexible policy will permit this Transactions to follow suit. Current appropriate topics are best reflected by recent Tables of Contents; they are summarized in the titles of editorial areas that appear on the inside front cover.
Latest Articles in This Journal
Table of Contents
IEEE Transactions on Information Theory Information for Authors
IEEE Transactions on Information Theory Publication Information
Reliable Computation by Large-Alphabet Formulas in the Presence of Noise
Capacity Results for the Wiretapped Oblivious Transfer