{"title":"Concentration and relative entropy for compound Poisson distributions","authors":"M. Madiman, Ioannis Kontoyiannis","doi":"10.1109/ISIT.2005.1523662","DOIUrl":null,"url":null,"abstract":"Using a simple inequality about the relative entropy, its so-called \"tensorization property,\" we give a simple proof of a functional inequality which is satisfied by any compound Poisson distribution. This functional inequality belongs to the class of modified logarithmic Sobolev inequalities. We use it to obtain measure concentration bounds for compound Poisson distributions under a variety of assumptions on their tail behavior. In particular, we show how the celebrated \"Herbst argument\" can be modified to yield sub-exponential concentration bounds. For example, suppose Z is a compound Poisson random variable with values on the nonnegative integers, and let f be a function such that |f(k+1) - f(k)| les 1 for all k. Then, if the base distribution of Z does not have a finite moment-generating function but has finite moments up to some order L > 1, we show that the probability that f(Z) exceeds its mean by a positive amount t or more decays approximately like (const)middott-L, where the constant is explicitly identified. This appears to be one of the very first examples of concentration bounds with power-law decay","PeriodicalId":166130,"journal":{"name":"Proceedings. International Symposium on Information Theory, 2005. ISIT 2005.","volume":"55 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2005-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings. International Symposium on Information Theory, 2005. ISIT 2005.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIT.2005.1523662","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Using a simple inequality about the relative entropy, its so-called "tensorization property," we give a simple proof of a functional inequality which is satisfied by any compound Poisson distribution. This functional inequality belongs to the class of modified logarithmic Sobolev inequalities. We use it to obtain measure concentration bounds for compound Poisson distributions under a variety of assumptions on their tail behavior. In particular, we show how the celebrated "Herbst argument" can be modified to yield sub-exponential concentration bounds. For example, suppose Z is a compound Poisson random variable with values on the nonnegative integers, and let f be a function such that |f(k+1) - f(k)| ≤ 1 for all k. Then, if the base distribution of Z does not have a finite moment-generating function but has finite moments up to some order L > 1, we show that the probability that f(Z) exceeds its mean by a positive amount t or more decays approximately like (const)·t^{-L}, where the constant is explicitly identified. This appears to be one of the very first examples of concentration bounds with power-law decay.
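To make the stated power-law tail concrete, here is a minimal numerical sketch (not from the paper) that simulates a compound Poisson variable whose base distribution has a heavy, zeta-like tail, and compares the empirical upper tail of the 1-Lipschitz choice f(k) = k with the t^{-L} decay described in the abstract. The rate λ, the power-law exponent, and the truncation of the base law are illustrative assumptions, not the authors' choices, and the explicit constant from the paper is not reproduced here.

```python
# Illustrative sketch (assumptions, not the paper's construction):
# Z = X_1 + ... + X_N with N ~ Poisson(lam) and X_i i.i.d. from a truncated
# zeta-like base law P(X = k) proportional to k^{-alpha}, which has finite
# moments only for orders below alpha - 1. We take f(k) = k, which satisfies
# |f(k+1) - f(k)| <= 1, and compare the empirical tail with t^{-L}.

import numpy as np

rng = np.random.default_rng(0)

lam = 2.0          # Poisson rate for the number of summands (illustrative)
alpha = 3.5        # base-law exponent; moments finite for orders < alpha - 1
L = alpha - 1.0    # heuristic order of the heaviest finite moment
n_samples = 100_000
support = np.arange(1, 10_000)

# Truncated power-law base distribution on {1, 2, ..., 9999}
probs = support.astype(float) ** (-alpha)
probs /= probs.sum()

def sample_compound_poisson(n):
    """Draw n samples of Z = X_1 + ... + X_N, N ~ Poisson(lam), X_i ~ probs."""
    counts = rng.poisson(lam, size=n)
    summands = rng.choice(support, size=counts.sum(), p=probs)
    # Split the flat array of summands back into one sum per sample
    split_points = np.cumsum(counts)[:-1]
    return np.array([chunk.sum() for chunk in np.split(summands, split_points)])

Z = sample_compound_poisson(n_samples)
centered = Z - Z.mean()   # f(Z) - E f(Z) with f(k) = k

for t in [10, 20, 40, 80]:
    empirical = (centered >= t).mean()
    print(f"t = {t:3d}   P(f(Z) - E f(Z) >= t) ~ {empirical:.2e}   t^(-L) = {t ** (-L):.2e}")
```

The point of the sketch is only qualitative: because the base law lacks a moment-generating function, the tail of f(Z) cannot decay exponentially, and the empirical frequencies should track a power of t rather than an exponential, in line with the bound stated in the abstract.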