Wasserstein GANs are Minimax Optimal Distribution Estimators

Arthur Stéphanovitch, Eddie Aamari, Clément Levrard
arXiv:2311.18613 · arXiv - MATH - Statistics Theory · Published 2023-11-30 · Journal Article
Citations: 0

Abstract

We provide non-asymptotic rates of convergence for the Wasserstein Generative Adversarial Network (WGAN) estimator. We build neural network classes representing the generators and discriminators which yield a GAN that achieves the minimax optimal rate for estimating a certain probability measure $\mu$ with support in $\mathbb{R}^p$. The probability measure $\mu$ is taken to be the pushforward of the Lebesgue measure on the $d$-dimensional torus $\mathbb{T}^d$ by a map $g^\star:\mathbb{T}^d\rightarrow \mathbb{R}^p$ of smoothness $\beta+1$. Measuring the error with the $\gamma$-Hölder Integral Probability Metric (IPM), we obtain, up to logarithmic factors, the minimax optimal rate $O(n^{-\frac{\beta+\gamma}{2\beta+d}}\vee n^{-\frac{1}{2}})$, where $n$ is the sample size, $\beta$ determines the smoothness of the target measure $\mu$, $\gamma$ is the smoothness of the IPM ($\gamma=1$ is the Wasserstein case), and $d\leq p$ is the intrinsic dimension of $\mu$. In the process, we derive a sharp interpolation inequality between Hölder IPMs. This novel result in the theory of function spaces generalizes classical interpolation inequalities to the case where the measures involved have densities on different manifolds.
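For context, the $\gamma$-Hölder IPM that measures the estimation error can be written in a standard form (the paper's exact normalization of the Hölder ball may differ):

```latex
% \gamma-Hölder Integral Probability Metric between probability measures \mu and \nu:
% the supremum runs over the unit ball of the \gamma-Hölder class.
d_{\gamma}(\mu, \nu)
  \;=\;
  \sup_{\substack{f \in \mathcal{H}^{\gamma}(\mathbb{R}^p) \\ \|f\|_{\mathcal{H}^{\gamma}} \le 1}}
  \left|\, \int f \,\mathrm{d}\mu \;-\; \int f \,\mathrm{d}\nu \,\right|.
```

For $\gamma = 1$ the unit Hölder ball is the class of 1-Lipschitz functions, and Kantorovich–Rubinstein duality identifies $d_1$ with the Wasserstein-1 distance, which is why $\gamma = 1$ is called the Wasserstein case above.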