Wasserstein GANs are Minimax Optimal Distribution Estimators
{"title":"Wasserstein gan是极大极小最优分布估计","authors":"Arthur Stéphanovitch, Eddie Aamari, Clément Levrard","doi":"arxiv-2311.18613","DOIUrl":null,"url":null,"abstract":"We provide non asymptotic rates of convergence of the Wasserstein Generative\nAdversarial networks (WGAN) estimator. We build neural networks classes\nrepresenting the generators and discriminators which yield a GAN that achieves\nthe minimax optimal rate for estimating a certain probability measure $\\mu$\nwith support in $\\mathbb{R}^p$. The probability $\\mu$ is considered to be the\npush forward of the Lebesgue measure on the $d$-dimensional torus\n$\\mathbb{T}^d$ by a map $g^\\star:\\mathbb{T}^d\\rightarrow \\mathbb{R}^p$ of\nsmoothness $\\beta+1$. Measuring the error with the $\\gamma$-H\\\"older Integral\nProbability Metric (IPM), we obtain up to logarithmic factors, the minimax\noptimal rate $O(n^{-\\frac{\\beta+\\gamma}{2\\beta +d}}\\vee n^{-\\frac{1}{2}})$\nwhere $n$ is the sample size, $\\beta$ determines the smoothness of the target\nmeasure $\\mu$, $\\gamma$ is the smoothness of the IPM ($\\gamma=1$ is the\nWasserstein case) and $d\\leq p$ is the intrinsic dimension of $\\mu$. In the\nprocess, we derive a sharp interpolation inequality between H\\\"older IPMs. This\nnovel result of theory of functions spaces generalizes classical interpolation\ninequalities to the case where the measures involved have densities on\ndifferent manifolds.","PeriodicalId":501330,"journal":{"name":"arXiv - MATH - Statistics Theory","volume":"90 4","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-11-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Wasserstein GANs are Minimax Optimal Distribution Estimators\",\"authors\":\"Arthur Stéphanovitch, Eddie Aamari, Clément Levrard\",\"doi\":\"arxiv-2311.18613\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We provide non asymptotic rates of convergence of the Wasserstein Generative\\nAdversarial networks (WGAN) estimator. We build neural networks classes\\nrepresenting the generators and discriminators which yield a GAN that achieves\\nthe minimax optimal rate for estimating a certain probability measure $\\\\mu$\\nwith support in $\\\\mathbb{R}^p$. The probability $\\\\mu$ is considered to be the\\npush forward of the Lebesgue measure on the $d$-dimensional torus\\n$\\\\mathbb{T}^d$ by a map $g^\\\\star:\\\\mathbb{T}^d\\\\rightarrow \\\\mathbb{R}^p$ of\\nsmoothness $\\\\beta+1$. Measuring the error with the $\\\\gamma$-H\\\\\\\"older Integral\\nProbability Metric (IPM), we obtain up to logarithmic factors, the minimax\\noptimal rate $O(n^{-\\\\frac{\\\\beta+\\\\gamma}{2\\\\beta +d}}\\\\vee n^{-\\\\frac{1}{2}})$\\nwhere $n$ is the sample size, $\\\\beta$ determines the smoothness of the target\\nmeasure $\\\\mu$, $\\\\gamma$ is the smoothness of the IPM ($\\\\gamma=1$ is the\\nWasserstein case) and $d\\\\leq p$ is the intrinsic dimension of $\\\\mu$. In the\\nprocess, we derive a sharp interpolation inequality between H\\\\\\\"older IPMs. 
This\\nnovel result of theory of functions spaces generalizes classical interpolation\\ninequalities to the case where the measures involved have densities on\\ndifferent manifolds.\",\"PeriodicalId\":501330,\"journal\":{\"name\":\"arXiv - MATH - Statistics Theory\",\"volume\":\"90 4\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-11-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - MATH - Statistics Theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2311.18613\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Statistics Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2311.18613","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Arthur Stéphanovitch, Eddie Aamari, Clément Levrard
We provide non-asymptotic rates of convergence for the Wasserstein Generative
Adversarial Network (WGAN) estimator. We build neural network classes
representing the generators and discriminators that yield a GAN achieving
the minimax optimal rate for estimating a certain probability measure $\mu$
with support in $\mathbb{R}^p$.
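As a schematic of the two network classes (our arbitrary sizes and depths, not the paper's explicit construction), the generator maps the torus into $\mathbb{R}^p$ and the discriminator plays the role of the IPM test function:

    # Schematic only: fully connected generator g: T^d -> R^p and
    # discriminator f: R^p -> R for the WGAN objective
    # sup_f E_mu[f] - E_{g#lambda}[f]; all sizes are illustrative.
    import torch.nn as nn

    d, p, width = 2, 8, 64  # illustrative dimensions, not the paper's choices

    generator = nn.Sequential(
        nn.Linear(d, width), nn.ReLU(),
        nn.Linear(width, width), nn.ReLU(),
        nn.Linear(width, p),
    )
    discriminator = nn.Sequential(
        nn.Linear(p, width), nn.ReLU(),
        nn.Linear(width, width), nn.ReLU(),
        nn.Linear(width, 1),
    )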
The measure $\mu$ is modeled as the pushforward of the Lebesgue measure on the
$d$-dimensional torus $\mathbb{T}^d$ by a map
$g^\star:\mathbb{T}^d\rightarrow \mathbb{R}^p$ of smoothness $\beta+1$.
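For intuition, here is a minimal sketch of this pushforward model with a toy map of our own choosing ($d=1$, $p=2$): sampling uniformly on the torus and applying a smooth map yields draws from a measure supported on a $d$-dimensional manifold in $\mathbb{R}^p$.

    # Toy instance of mu = pushforward of Lebesgue(T^d) by g_star, d = 1, p = 2;
    # g_star below is our illustrative choice (the circle embedding).
    import numpy as np

    def g_star(theta):
        # a smooth map T^1 -> R^2
        return np.stack([np.cos(2 * np.pi * theta),
                         np.sin(2 * np.pi * theta)], axis=-1)

    rng = np.random.default_rng(0)
    theta = rng.uniform(0.0, 1.0, size=1000)  # uniform (Lebesgue) on T^1
    samples = g_star(theta)  # 1000 draws from mu, supported on a curve in R^2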
Measuring the error with the $\gamma$-H\"older Integral
Probability Metric (IPM), we obtain, up to logarithmic factors, the minimax
optimal rate $O(n^{-\frac{\beta+\gamma}{2\beta+d}}\vee n^{-\frac{1}{2}})$,
where $n$ is the sample size, $\beta$ determines the smoothness of the target
measure $\mu$, $\gamma$ is the smoothness of the IPM ($\gamma=1$ is the
Wasserstein case), and $d\leq p$ is the intrinsic dimension of $\mu$.
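For concreteness, a worked reading of the rate (our instantiation, not taken from the text): since $\vee$ denotes a maximum, the parametric term prevails exactly when the nonparametric exponent reaches $\frac{1}{2}$,

$$\frac{\beta+\gamma}{2\beta+d}\ \geq\ \frac{1}{2}
\ \Longleftrightarrow\ 2(\beta+\gamma)\geq 2\beta+d
\ \Longleftrightarrow\ d\leq 2\gamma.$$

In the Wasserstein case $\gamma=1$ with $\beta=3$, the bound is therefore $n^{-1/2}$ for $d=2$, while for $d=4$ it degrades to $n^{-\frac{3+1}{2\cdot 3+4}}=n^{-2/5}$.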
In the process, we derive a sharp interpolation inequality between H\"older
IPMs. This novel result in the theory of function spaces generalizes classical
interpolation inequalities to the case where the measures involved have
densities on different manifolds.
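As a quick empirical illustration of the $\gamma=1$ (Wasserstein) case, a sanity check of our own (not the paper's estimator): in one dimension, the Wasserstein-1 distance between two independent empirical measures of the same law decays at the parametric rate $n^{-1/2}$.

    # Sanity check (ours): W1 between independent empirical measures of the
    # same 1-dim law shrinks like n^{-1/2}, i.e. roughly 10x per 100x more data.
    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(1)
    for n in (10**2, 10**4, 10**6):
        x = rng.normal(size=n)  # n samples from mu
        y = rng.normal(size=n)  # independent n samples from mu
        print(n, wasserstein_distance(x, y))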