A proper scoring rule for minimum information bivariate copulas
Yici Chen, Tomonari Sei
Journal of Multivariate Analysis, Volume 201, Article 105271, December 2023 (Q2, Statistics & Probability; Impact Factor 1.4)
DOI: 10.1016/j.jmva.2023.105271
URL: https://www.sciencedirect.com/science/article/pii/S0047259X23001173
Two-dimensional distributions whose marginal distributions are uniform are called bivariate copulas. Among them, the one that satisfies given constraints on expectation and is closest to the independent distribution in the sense of Kullback–Leibler divergence is called the minimum information bivariate copula. The density function of the minimum information copula contains a set of functions, called the normalizing functions, which are often difficult to compute. Although a number of proper scoring rules have been proposed for probability distributions with normalizing constants, such as exponential families, these scores are not applicable to minimum information copulas because of the normalizing functions. In this paper, we propose the conditional Kullback–Leibler score, which avoids computation of the normalizing functions. The main idea of its construction is to use pairs of observations. We show that the proposed score is strictly proper in the space of copula density functions, and therefore the estimator derived from it is asymptotically consistent. Furthermore, the score is convex with respect to the parameters and can be easily optimized by gradient methods.
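To illustrate how pairing observations can make marginal terms drop out, here is a minimal sketch. It assumes a density of the exponential form c_θ(x, y) ∝ exp(θ·h(x, y) + a(x) + b(y)) with a single constraint function h (here h(x, y) = xy, purely for illustration): conditioning on the unordered set of marginal values of a pair of observations, the normalizing functions a and b cancel, and each pair contributes a logistic term that is convex in θ. This is a plausible reading of the "pairs of observations" idea, not the authors' exact construction.

```python
import numpy as np

def h(x, y):
    # Illustrative constraint function (an assumption; the framework
    # allows general constraint functions on expectation).
    return x * y

def pairwise_score(theta, xs, ys):
    """Average negative log conditional likelihood over observation pairs.

    For each pair (x_i, y_i), (x_j, y_j), conditioning on the unordered
    pair of marginal values cancels the a(x), b(y) terms of the density,
    leaving a logistic term in
        delta = h(x_i, y_i) + h(x_j, y_j) - h(x_i, y_j) - h(x_j, y_i),
    namely -log sigmoid(theta * delta). Each term is convex in theta,
    so the average is convex and amenable to gradient methods.
    """
    n = len(xs)
    total, count = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            delta = (h(xs[i], ys[i]) + h(xs[j], ys[j])
                     - h(xs[i], ys[j]) - h(xs[j], ys[i]))
            # -log sigmoid(theta * delta), computed stably via logaddexp.
            total += np.logaddexp(0.0, -theta * delta)
            count += 1
    return total / count
```

At theta = 0 each pair contributes exactly log 2, and the score can be minimized in theta with any off-the-shelf convex optimizer, since no normalizing function ever needs to be evaluated.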
Journal introduction:
Founded in 1971, the Journal of Multivariate Analysis (JMVA) is the central venue for the publication of new, relevant methodology and particularly innovative applications pertaining to the analysis and interpretation of multidimensional data.
The journal welcomes contributions to all aspects of multivariate data analysis and modeling, including cluster analysis, discriminant analysis, factor analysis, and multidimensional continuous or discrete distribution theory. Topics of current interest include, but are not limited to, inferential aspects of
Copula modeling
Functional data analysis
Graphical modeling
High-dimensional data analysis
Image analysis
Multivariate extreme-value theory
Sparse modeling
Spatial statistics.