{"title":"论累积萨利斯熵","authors":"Thomas Simon, Guillaume Dulac","doi":"10.1007/s10440-023-00620-3","DOIUrl":null,"url":null,"abstract":"<div><p>We investigate the cumulative Tsallis entropy, an information measure recently introduced as a cumulative version of the classical Tsallis differential entropy, which is itself a generalization of the Boltzmann-Gibbs statistics. This functional is here considered as a perturbation of the expected mean residual life via some power weight function. This point of view leads to the introduction of the dual cumulative Tsallis entropy and of two families of coherent risk measures generalizing those built on mean residual life. We characterize the finiteness of the cumulative Tsallis entropy in terms of <span>\\({\\mathcal{L}}_{p}\\)</span>-spaces and show how they determine the underlying distribution. The range of the functional is exactly described under various constraints, with optimal bounds improving on all those previously available in the literature. 
Whereas the maximization of the Tsallis differential entropy gives rise to the classical <span>\\(q\\)</span>-Gaussian distribution which is a generalization of the Gaussian having a finite range or heavy tails, the maximization of the cumulative Tsallis entropy leads to an analogous perturbation of the Logistic distribution.</p></div>","PeriodicalId":53132,"journal":{"name":"Acta Applicandae Mathematicae","volume":"188 1","pages":""},"PeriodicalIF":1.2000,"publicationDate":"2023-11-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"On Cumulative Tsallis Entropies\",\"authors\":\"Thomas Simon, Guillaume Dulac\",\"doi\":\"10.1007/s10440-023-00620-3\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>We investigate the cumulative Tsallis entropy, an information measure recently introduced as a cumulative version of the classical Tsallis differential entropy, which is itself a generalization of the Boltzmann-Gibbs statistics. This functional is here considered as a perturbation of the expected mean residual life via some power weight function. This point of view leads to the introduction of the dual cumulative Tsallis entropy and of two families of coherent risk measures generalizing those built on mean residual life. We characterize the finiteness of the cumulative Tsallis entropy in terms of <span>\\\\({\\\\mathcal{L}}_{p}\\\\)</span>-spaces and show how they determine the underlying distribution. The range of the functional is exactly described under various constraints, with optimal bounds improving on all those previously available in the literature. 
Whereas the maximization of the Tsallis differential entropy gives rise to the classical <span>\\\\(q\\\\)</span>-Gaussian distribution which is a generalization of the Gaussian having a finite range or heavy tails, the maximization of the cumulative Tsallis entropy leads to an analogous perturbation of the Logistic distribution.</p></div>\",\"PeriodicalId\":53132,\"journal\":{\"name\":\"Acta Applicandae Mathematicae\",\"volume\":\"188 1\",\"pages\":\"\"},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2023-11-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Acta Applicandae Mathematicae\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s10440-023-00620-3\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Acta Applicandae Mathematicae","FirstCategoryId":"100","ListUrlMain":"https://link.springer.com/article/10.1007/s10440-023-00620-3","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
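The abstract treats the cumulative Tsallis entropy as an integral functional of the survival function. The paper itself does not restate the definition here, but a common form in the literature is \(\mathcal{C}_q(X) = \frac{1}{q-1}\int_0^\infty \big(\bar{F}(x) - \bar{F}(x)^q\big)\,dx\), where \(\bar{F}\) is the survival function. The sketch below evaluates this integral numerically; the helper name and the exact normalization are assumptions, since the paper's convention may differ.

```python
import numpy as np

def cumulative_tsallis_entropy(survival, q, grid):
    """Numerically evaluate C_q = (1/(q-1)) * integral of (S(x) - S(x)^q) dx
    on a truncated grid. Hypothetical helper: the paper's exact
    definition/normalization of the cumulative Tsallis entropy may differ."""
    s = survival(grid)
    return np.trapz(s - s**q, grid) / (q - 1.0)

# Sanity check with a closed form: for S(x) = exp(-x) (Exponential(1)),
# the integral gives (1 - 1/q)/(q - 1) = 1/q.
grid = np.linspace(0.0, 50.0, 200_001)  # tail beyond 50 is negligible
val = cumulative_tsallis_entropy(lambda x: np.exp(-x), q=2.0, grid=grid)
print(round(val, 4))  # close to 1/q = 0.5
```

In the limit \(q \to 1\) this functional reduces to the cumulative residual entropy, consistent with the abstract's description of it as a power-weight perturbation of the expected mean residual life.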
Journal description:
Acta Applicandae Mathematicae is devoted to the art and techniques of applying mathematics and the development of new, applicable mathematical methods.
Covering a broad spectrum from modeling to qualitative analysis and computational methods, Acta Applicandae Mathematicae publishes papers on different aspects of the relationship between theory and applications, ranging from descriptive papers on actual applications that meet contemporary mathematical standards to proofs of new and deep theorems in applied mathematics.