Abdelrahman Marconi, A. H. Elghandour, Ashraf D. Elbayoumy, Amr Abdelaziz
{"title":"混合高斯分布的微分熵的严格下限","authors":"Abdelrahman Marconi, A. H. Elghandour, Ashraf D. Elbayoumy, Amr Abdelaziz","doi":"10.26636/jtit.2024.2.1444","DOIUrl":null,"url":null,"abstract":"In this paper, a tight lower bound for the differential entropy of the Gaussian mixture model is presented. First, the probability model of mixed Gaussian distribution that is created by mixing both discrete and continuous random variables is investigated in order to represent symmetric bimodal Gaussian distribution using the hyperbolic cosine function, on which a tighter upper bound is set. Then, this tight upper bound is used to derive a tight lower bound for the differential entropy of the Gaussian mixture model introduced. The proposed lower bound allows to maintain its tightness over the entire range of the model's parameters and shows more tightness when compared with other bounds that lose their tightness over certain parameter ranges. The presented results are then extended to introduce a more general tight lower bound for asymmetric bimodal Gaussian distribution, in which the two modes have a symmetric mean but differ in terms of their weights.","PeriodicalId":38425,"journal":{"name":"Journal of Telecommunications and Information Technology","volume":"55 2","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Tight Lower Bound on Differential Entropy for Mixed Gaussian Distributions\",\"authors\":\"Abdelrahman Marconi, A. H. Elghandour, Ashraf D. Elbayoumy, Amr Abdelaziz\",\"doi\":\"10.26636/jtit.2024.2.1444\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, a tight lower bound for the differential entropy of the Gaussian mixture model is presented. 
First, the probability model of mixed Gaussian distribution that is created by mixing both discrete and continuous random variables is investigated in order to represent symmetric bimodal Gaussian distribution using the hyperbolic cosine function, on which a tighter upper bound is set. Then, this tight upper bound is used to derive a tight lower bound for the differential entropy of the Gaussian mixture model introduced. The proposed lower bound allows to maintain its tightness over the entire range of the model's parameters and shows more tightness when compared with other bounds that lose their tightness over certain parameter ranges. The presented results are then extended to introduce a more general tight lower bound for asymmetric bimodal Gaussian distribution, in which the two modes have a symmetric mean but differ in terms of their weights.\",\"PeriodicalId\":38425,\"journal\":{\"name\":\"Journal of Telecommunications and Information Technology\",\"volume\":\"55 2\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-04-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Telecommunications and Information Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.26636/jtit.2024.2.1444\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"Engineering\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Telecommunications and Information 
Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.26636/jtit.2024.2.1444","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"Engineering","Score":null,"Total":0}
Tight Lower Bound on Differential Entropy for Mixed Gaussian Distributions
In this paper, a tight lower bound on the differential entropy of the Gaussian mixture model is presented. First, the probability model of a mixed Gaussian distribution, created by mixing discrete and continuous random variables, is investigated in order to represent the symmetric bimodal Gaussian distribution using the hyperbolic cosine function, for which a tight upper bound is established. This upper bound is then used to derive a tight lower bound on the differential entropy of the Gaussian mixture model. The proposed lower bound remains tight over the entire range of the model's parameters, in contrast to other bounds that lose their tightness over certain parameter ranges. The results are then extended to a more general tight lower bound for the asymmetric bimodal Gaussian distribution, in which the two modes have symmetric means but different weights.
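The symmetric bimodal model referenced in the abstract has a compact hyperbolic-cosine form: mixing N(mu, sigma^2) and N(-mu, sigma^2) with equal weights gives p(x) = (1 / (sqrt(2*pi) * sigma)) * exp(-(x^2 + mu^2) / (2*sigma^2)) * cosh(mu*x / sigma^2). As a minimal illustrative sketch (not the paper's derivation, and the proposed bound itself is not reproduced here), the snippet below evaluates this density and computes its differential entropy by numerical integration, so the two limiting regimes are visible: as mu -> 0 the entropy approaches that of a single Gaussian, 0.5*log(2*pi*e*sigma^2), and for well-separated modes it approaches that value plus log 2.

```python
import numpy as np
from scipy.integrate import quad

def bimodal_pdf(x, mu, sigma):
    # Equal-weight symmetric bimodal Gaussian mixture,
    # 0.5*N(mu, sigma^2) + 0.5*N(-mu, sigma^2),
    # written compactly via the hyperbolic cosine.
    return (np.exp(-(x**2 + mu**2) / (2 * sigma**2))
            * np.cosh(mu * x / sigma**2)
            / (np.sqrt(2 * np.pi) * sigma))

def differential_entropy(mu, sigma):
    # Numerically evaluate h(X) = -integral of p(x) log p(x) dx.
    # Integration is truncated to +/- (|mu| + 12 sigma), where the
    # density is negligible, to avoid log(0) issues in the far tails.
    def integrand(x):
        p = bimodal_pdf(x, mu, sigma)
        return -p * np.log(p) if p > 0 else 0.0
    lim = abs(mu) + 12 * sigma
    val, _ = quad(integrand, -lim, lim, limit=200)
    return val
```

For mu = 0 the density collapses to N(0, sigma^2) (cosh(0) = 1), giving the familiar Gaussian entropy; for large mu/sigma the modes barely overlap and the entropy gains exactly the log 2 of the mode indicator, which is the regime in which loose bounds typically degrade.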