{"title":"Variants of Mixtures: Information Properties and Applications","authors":"Omid M. Ardakani, M. Asadi, N. Ebrahimi, E. Soofi","doi":"10.52547/jirss.20.1.27","DOIUrl":null,"url":null,"abstract":"In recent years, we have studied information properties of various types of mixtures of probability distributions and introduced a new type, which includes previously known mixtures as special cases. These studies are disseminated in di ff erent fields: reliability engineering, econometrics, operations research, probability, the information theory, and data mining. This paper presents a holistic view of these studies and provides further insights and examples. We note that the insightful probabilistic formulation of the mixing parameters stipulated by Behboodian (1972) is required for a representation of the well-known information measure of the arithmetic mixture. Applications of this information measure presented in this paper include lifetime modeling, system reliability, measuring uncertainty and disagreement of forecasters, probability modeling with partial information, and information loss of kernel estimation. Probabilistic formulations of the mixing weights for various types of mixtures provide the Bayes-Fisher information and the Bayes risk of the mean residual function. MSC: 62B10, 62C05, 60E05, 60E15, 62N05, 94A15, 94A17. (CDF, PDF, SF, HR, MR, OR). The study of information properties of various types of mixtures involves assortments of information and divergence measures: Shannon entropy, KL, JS, Je ff reys, Chi-square, Rényi, Tsallis, and Je ff reys type symmetrized Tsallis divergences, Fisher information measure and Fisher information distance, KL type divergence between SFs, and expected L 1 -norm between cumulative hazards. Areas of applications covered include reliability (comparison of systems), econometrics (uncertainty and disagreements of forecasters), statistics (kernel estimation, exponential family, comparison of two normal means), and nonextensive statistical mechanics (escort distributions).","PeriodicalId":42965,"journal":{"name":"JIRSS-Journal of the Iranian Statistical Society","volume":" ","pages":""},"PeriodicalIF":0.1000,"publicationDate":"2021-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"JIRSS-Journal of the Iranian Statistical Society","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.52547/jirss.20.1.27","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Citations: 2
Abstract
In recent years, we have studied information properties of various types of mixtures of probability distributions and introduced a new type, which includes previously known mixtures as special cases. These studies are disseminated across different fields: reliability engineering, econometrics, operations research, probability, information theory, and data mining. This paper presents a holistic view of these studies and provides further insights and examples. We note that the insightful probabilistic formulation of the mixing parameters stipulated by Behboodian (1972) is required for a representation of the well-known information measure of the arithmetic mixture. Applications of this information measure presented in this paper include lifetime modeling, system reliability, measuring the uncertainty and disagreement of forecasters, probability modeling with partial information, and the information loss of kernel estimation. Probabilistic formulations of the mixing weights for various types of mixtures provide the Bayes-Fisher information and the Bayes risk of the mean residual function.

MSC: 62B10, 62C05, 60E05, 60E15, 62N05, 94A15, 94A17.
Abbreviations: CDF, PDF, SF, HR, MR, OR.

The study of information properties of various types of mixtures involves an assortment of information and divergence measures: Shannon entropy; the KL, JS, Jeffreys, chi-square, Rényi, Tsallis, and Jeffreys-type symmetrized Tsallis divergences; the Fisher information measure and Fisher information distance; a KL-type divergence between SFs; and the expected L1-norm between cumulative hazards. Areas of application covered include reliability (comparison of systems), econometrics (uncertainty and disagreement of forecasters), statistics (kernel estimation, the exponential family, comparison of two normal means), and nonextensive statistical mechanics (escort distributions).
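To make the "well-known information measure of the arithmetic mixture" concrete, the following Python sketch (not from the paper; the components, support, and weights are arbitrary illustrative choices) numerically verifies the standard decomposition H(f) = sum_k p_k H(f_k) + JS_p(f_1, ..., f_n) for an arithmetic mixture f = sum_k p_k f_k, where the Jensen-Shannon term equals the mutual information between the outcome and a latent mixing indicator K with P(K = k) = p_k, i.e., the probabilistic reading of the mixing weights attributed to Behboodian (1972).

```python
# Minimal numerical sketch of the arithmetic-mixture entropy decomposition:
#   H(f) = sum_k p_k H(f_k) + JS_p(f_1, ..., f_n),
# where JS_p equals the mutual information I(X; K) for a mixing indicator K
# with P(K = k) = p_k. Distributions below are arbitrary illustrative choices.
import numpy as np

def shannon_entropy(q):
    """Shannon entropy (in nats) of a discrete distribution q."""
    q = np.asarray(q, dtype=float)
    q = q[q > 0]
    return -np.sum(q * np.log(q))

# Two illustrative component distributions on a common 4-point support.
f1 = np.array([0.70, 0.10, 0.10, 0.10])
f2 = np.array([0.10, 0.20, 0.30, 0.40])
p  = np.array([0.35, 0.65])            # mixing weights, P(K = k)

components = np.vstack([f1, f2])
mixture = p @ components               # arithmetic mixture f = sum_k p_k f_k

# Left-hand side: entropy of the mixture, H(f).
H_mixture = shannon_entropy(mixture)

# Within-component term: H(X | K) = sum_k p_k H(f_k).
H_within = np.sum([pk * shannon_entropy(fk) for pk, fk in zip(p, components)])

# Jensen-Shannon term: sum_k p_k KL(f_k || f) = I(X; K).
js = np.sum([pk * np.sum(fk * np.log(fk / mixture))
             for pk, fk in zip(p, components)])

print(f"H(mixture)            = {H_mixture:.6f}")
print(f"sum_k p_k H(f_k) + JS = {H_within + js:.6f}")   # matches H(mixture)
```

The two printed values agree, illustrating why the Jensen-Shannon divergence of the components (equivalently, the mutual information carried by the mixing indicator) is the natural information measure associated with the arithmetic mixture.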