Unadjusted Langevin Algorithm for Non-convex Weakly Smooth Potentials

IF 1.1 | CAS Tier 4 (Mathematics) | JCR Q1 (Mathematics) | Communications in Mathematics and Statistics | Pub Date: 2023-12-09 | DOI: 10.1007/s40304-023-00350-w
Dao Nguyen, Xin Dang, Yixin Chen
{"title":"非凸弱光滑势的未调整朗格文算法","authors":"Dao Nguyen, Xin Dang, Yixin Chen","doi":"10.1007/s40304-023-00350-w","DOIUrl":null,"url":null,"abstract":"<p>Discretization of continuous-time diffusion processes is a widely recognized method for sampling. However, the canonical Euler Maruyama discretization of the Langevin diffusion process, referred as unadjusted Langevin algorithm (ULA), studied mostly in the context of smooth (gradient Lipschitz) and strongly log-concave densities, is a considerable hindrance for its deployment in many sciences, including statistics and machine learning. In this paper, we establish several theoretical contributions to the literature on such sampling methods for non-convex distributions. Particularly, we introduce a new mixture weakly smooth condition, under which we prove that ULA will converge with additional log-Sobolev inequality. We also show that ULA for smoothing potential will converge in <span>\\(L_{2}\\)</span>-Wasserstein distance. Moreover, using convexification of nonconvex domain (Ma et al. in Proc Natl Acad Sci 116(42):20881–20885, 2019) in combination with regularization, we establish the convergence in Kullback–Leibler divergence with the number of iterations to reach <span>\\(\\epsilon \\)</span>-neighborhood of a target distribution in only polynomial dependence on the dimension. We relax the conditions of Vempala and Wibisono (Advances in Neural Information Processing Systems, 2019) and prove convergence guarantees under isoperimetry, and non-strongly convex at infinity.</p>","PeriodicalId":10575,"journal":{"name":"Communications in Mathematics and Statistics","volume":null,"pages":null},"PeriodicalIF":1.1000,"publicationDate":"2023-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Unadjusted Langevin Algorithm for Non-convex Weakly Smooth Potentials\",\"authors\":\"Dao Nguyen, Xin Dang, Yixin Chen\",\"doi\":\"10.1007/s40304-023-00350-w\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Discretization of continuous-time diffusion processes is a widely recognized method for sampling. However, the canonical Euler Maruyama discretization of the Langevin diffusion process, referred as unadjusted Langevin algorithm (ULA), studied mostly in the context of smooth (gradient Lipschitz) and strongly log-concave densities, is a considerable hindrance for its deployment in many sciences, including statistics and machine learning. In this paper, we establish several theoretical contributions to the literature on such sampling methods for non-convex distributions. Particularly, we introduce a new mixture weakly smooth condition, under which we prove that ULA will converge with additional log-Sobolev inequality. We also show that ULA for smoothing potential will converge in <span>\\\\(L_{2}\\\\)</span>-Wasserstein distance. Moreover, using convexification of nonconvex domain (Ma et al. in Proc Natl Acad Sci 116(42):20881–20885, 2019) in combination with regularization, we establish the convergence in Kullback–Leibler divergence with the number of iterations to reach <span>\\\\(\\\\epsilon \\\\)</span>-neighborhood of a target distribution in only polynomial dependence on the dimension. 
We relax the conditions of Vempala and Wibisono (Advances in Neural Information Processing Systems, 2019) and prove convergence guarantees under isoperimetry, and non-strongly convex at infinity.</p>\",\"PeriodicalId\":10575,\"journal\":{\"name\":\"Communications in Mathematics and Statistics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.1000,\"publicationDate\":\"2023-12-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Communications in Mathematics and Statistics\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s40304-023-00350-w\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Communications in Mathematics and Statistics","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s40304-023-00350-w","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0

Abstract


Discretization of continuous-time diffusion processes is a widely recognized method for sampling. However, the canonical Euler–Maruyama discretization of the Langevin diffusion process, referred to as the unadjusted Langevin algorithm (ULA), has mostly been studied in the context of smooth (gradient-Lipschitz) and strongly log-concave densities, a restriction that considerably hinders its deployment in many sciences, including statistics and machine learning. In this paper, we establish several theoretical contributions to the literature on such sampling methods for non-convex distributions. In particular, we introduce a new mixture weakly smooth condition, under which we prove that ULA converges provided an additional log-Sobolev inequality holds. We also show that ULA applied to a smoothed potential converges in the \(L_{2}\)-Wasserstein distance. Moreover, using convexification of the non-convex domain (Ma et al. in Proc Natl Acad Sci 116(42):20881–20885, 2019) in combination with regularization, we establish convergence in Kullback–Leibler divergence, with the number of iterations required to reach an \(\epsilon \)-neighborhood of the target distribution depending only polynomially on the dimension. We relax the conditions of Vempala and Wibisono (Advances in Neural Information Processing Systems, 2019) and prove convergence guarantees under isoperimetry and non-strong convexity at infinity.
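For concreteness, the ULA iteration the abstract refers to is the Euler–Maruyama step \(x_{k+1} = x_{k} - \gamma \nabla U(x_{k}) + \sqrt{2\gamma }\,\xi _{k}\), where \(U\) is the potential (negative log density of the target), \(\gamma \) is the step size, and \(\xi _{k} \sim \mathcal {N}(0, I_{d})\). Below is a minimal Python sketch of this update; the double-well potential, step size, and iteration count are illustrative placeholders, not examples or tuning choices taken from the paper.

```python
import numpy as np

def ula_sample(grad_U, x0, step, n_iters, rng=None):
    """Unadjusted Langevin algorithm: Euler-Maruyama discretization
    of the Langevin diffusion dX_t = -grad U(X_t) dt + sqrt(2) dB_t."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    samples = np.empty((n_iters, x.size))
    for k in range(n_iters):
        xi = rng.standard_normal(x.size)
        # ULA update: x_{k+1} = x_k - step * grad U(x_k) + sqrt(2 * step) * xi_k
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * xi
        samples[k] = x
    return samples

# Illustrative non-convex target (a placeholder, not from the paper):
# double-well potential U(x) = (||x||^2 - 1)^2, so grad U(x) = 4 (||x||^2 - 1) x.
def grad_U(x):
    return 4.0 * (x @ x - 1.0) * x

draws = ula_sample(grad_U, x0=np.zeros(2), step=1e-2, n_iters=50_000)
print(draws.mean(axis=0), draws.var(axis=0))
```

After an initial burn-in, the iterates behave like (biased) samples from the density proportional to \(e^{-U}\); the bias shrinks with the step size, which is why the step-size-versus-accuracy trade-off is central to the convergence guarantees discussed above.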

Source Journal
Communications in Mathematics and Statistics (Mathematics: Statistics and Probability)
CiteScore: 1.80
Self-citation rate: 0.00%
Articles per year: 36
About the journal: Communications in Mathematics and Statistics is an international journal published by Springer-Verlag in collaboration with the School of Mathematical Sciences, University of Science and Technology of China (USTC). The journal is committed to publishing high-level, original, peer-reviewed research papers in various areas of the mathematical sciences, including pure mathematics, applied mathematics, computational mathematics, and probability and statistics. Typically one volume is published each year, and each volume consists of four issues.
Latest articles in this journal
- Inference for Partially Linear Quantile Regression Models in Ultrahigh Dimension
- Stopping Levels for a Spectrally Negative Markov Additive Process
- Characterization of Graphs with Some Normalized Laplacian Eigenvalue Having Multiplicity \(n-4\)
- Three Favorite Edges Occurs Infinitely Often for One-Dimensional Simple Random Walk
- Equivalence Assessment via the Difference Between Two AUCs in a Matched-Pair Design with Nonignorable Missing Endpoints