{"title":"Density Estimation of Processes with Memory via Donsker Vardhan","authors":"Ziv Aharoni, Dor Tsur, H. Permuter","doi":"10.1109/ISIT50566.2022.9834775","DOIUrl":null,"url":null,"abstract":"Density estimation plays an important role in modeling random variables (RVs) with continuous alphabets. This work provides an algorithm that estimates the probability density function (PDF) of stationary and ergodic random processes using recurrent neural networks (RNNs). The main idea is to decompose the target PDF into a known auxiliary PDF and a likelihood ratio between the target and auxiliary PDFs. The algorithm focuses on estimating the likelihood ratio using the Donsker Vardhan (DV) variational formula of Kullback Leibler (KL) divergence. Together, the maximizer of the DV formula and the auxiliary PDF are used to construct the estimator of the target PDF in the form of a Gibbs density. The obtained estimator converges to the target PDF in total variation (TV) and in distribution. Also, we show that proposed estimator minimizes the cross entropy (CE) between the target and auxiliary distribution, and that with a proper choice of the auxiliary distribution, it defines a tight upper bound on the entropy rate. 
We demonstrate this approach by estimating the density of a Gaussian hidden Markov model.","PeriodicalId":348168,"journal":{"name":"2022 IEEE International Symposium on Information Theory (ISIT)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Symposium on Information Theory (ISIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIT50566.2022.9834775","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Density estimation plays an important role in modeling random variables (RVs) with continuous alphabets. This work provides an algorithm that estimates the probability density function (PDF) of stationary and ergodic random processes using recurrent neural networks (RNNs). The main idea is to decompose the target PDF into a known auxiliary PDF and a likelihood ratio between the target and auxiliary PDFs. The algorithm focuses on estimating the likelihood ratio using the Donsker-Varadhan (DV) variational formula of the Kullback-Leibler (KL) divergence. Together, the maximizer of the DV formula and the auxiliary PDF are used to construct the estimator of the target PDF in the form of a Gibbs density. The obtained estimator converges to the target PDF in total variation (TV) and in distribution. We also show that the proposed estimator minimizes the cross entropy (CE) between the target and auxiliary distributions, and that, with a proper choice of the auxiliary distribution, it defines a tight upper bound on the entropy rate. We demonstrate this approach by estimating the density of a Gaussian hidden Markov model.
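The construction can be illustrated on a toy i.i.d. Gaussian case (the paper itself handles processes with memory by using RNN critics; the linear critic T(x) = a·x, the choice of target P = N(1, 1), auxiliary Q = N(0, 1), and all numeric settings below are assumptions for illustration, not the paper's setup). A critic is trained by gradient ascent on the DV objective J(T) = E_P[T] - log E_Q[exp T], whose supremum equals KL(P‖Q); the trained critic and the known auxiliary density then yield the Gibbs-form estimate p̂(x) = q(x) exp(T(x)) / E_Q[exp T]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from the (treated-as-unknown) target P = N(1, 1)
# and from the known auxiliary Q = N(0, 1).
xp = rng.normal(1.0, 1.0, 100_000)
xq = rng.normal(0.0, 1.0, 100_000)

def q_pdf(x):
    """Known auxiliary density, standard normal."""
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

# Linear critic T(x) = a * x (a shift term is dropped: the DV objective
# is invariant to it, and the Gibbs normalization absorbs it anyway).
# Gradient ascent on J(a) = E_P[T] - log E_Q[exp(T)].
a = 0.0
for _ in range(500):
    w = np.exp(a * xq)
    grad = xp.mean() - (xq * w).mean() / w.mean()
    a += 0.05 * grad

# Value of the DV objective at the trained critic; for these two
# Gaussians the true KL divergence is 0.5.
dv = a * xp.mean() - np.log(np.exp(a * xq).mean())

# Gibbs-form density estimate: p_hat(x) = q(x) exp(T(x)) / E_Q[exp(T)],
# with the normalizer estimated from the auxiliary samples.
Z = np.exp(a * xq).mean()

def p_hat(x):
    return q_pdf(x) * np.exp(a * x) / Z
```

For this pair the optimal critic is the true log-likelihood ratio x - 1/2, so the fitted slope should approach 1, the DV value should approach KL(P‖Q) = 0.5, and p̂ should track the N(1, 1) density.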