
Latest articles in Statistical Science

The Costs and Benefits of Uniformly Valid Causal Inference with High-Dimensional Nuisance Parameters
IF 5.7, CAS Tier 1 (Mathematics), Q1 STATISTICS & PROBABILITY. Pub Date: 2021-05-05. DOI: 10.1214/21-sts843
Niloofar Moosavi, J. Haggstrom, X. Luna
Important advances have recently been achieved in developing procedures yielding uniformly valid inference for a low-dimensional causal parameter when high-dimensional nuisance models must be estimated. In this paper, we review the literature on uniformly valid causal inference and discuss the costs and benefits of using uniformly valid inference procedures. Naive estimation strategies based on regularisation, machine learning, or a preliminary model selection stage for the nuisance models have finite sample distributions which are badly approximated by their asymptotic distributions. To solve this serious problem, estimators which converge uniformly in distribution over a class of data generating mechanisms have been proposed in the literature. In order to obtain uniformly valid results in high-dimensional situations, sparsity conditions for the nuisance models typically need to be imposed, although a double robustness property holds, whereby if one of the nuisance models is more sparse, the other nuisance model is allowed to be less sparse. While uniformly valid inference is a highly desirable property, uniformly valid procedures pay a high price in terms of inflated variability. Our discussion of this dilemma is illustrated by the study of a double-selection outcome regression estimator, which we show is uniformly asymptotically unbiased, but which is less variable than uniformly valid estimators in the numerical experiments conducted.
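The double-selection idea behind the estimator studied here can be sketched on simulated data: select controls predictive of the treatment, select controls predictive of the outcome, then regress the outcome on the treatment plus the union of both sets. The sketch below is illustrative only and is not the authors' estimator; for brevity, a marginal-correlation screening rule stands in for the lasso selection steps, and the data-generating model is an assumption made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, tau = 500, 200, 1.0
X = rng.standard_normal((n, p))
D = X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(n)      # treatment model
Y = tau * D + X[:, 0] - X[:, 2] + rng.standard_normal(n)  # outcome model

def screen(target, X, thresh=0.15):
    # Indices of columns whose absolute sample correlation with target exceeds thresh.
    t = target - target.mean()
    Xc = X - X.mean(axis=0)
    r = np.abs(Xc.T @ t) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(t))
    return np.flatnonzero(r > thresh)

# Double selection: union of controls chosen for the treatment and the outcome,
# then ordinary least squares of Y on D and the selected controls.
keep = np.union1d(screen(D, X), screen(Y, X))
Z = np.column_stack([np.ones(n), D, X[:, keep]])
tau_hat = np.linalg.lstsq(Z, Y, rcond=None)[0][1]
```

Selecting on both nuisance models protects the causal coefficient against omitted-variable bias from either single selection step.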
Citations: 7
Comment: Settle the Unsettling: An Inferential Models Perspective
IF 5.7, CAS Tier 1 (Mathematics), Q1 STATISTICS & PROBABILITY. Pub Date: 2021-05-01. DOI: 10.1214/21-STS765B
Chuanhai Liu, Ryan Martin
Here, we demonstrate that the inferential model (IM) framework, unlike the updating rules that Gong and Meng show to be unreliable, provides valid and efficient inference and prediction while not being susceptible to sure loss. In this sense, the IM framework settles what Gong and Meng characterized as "unsettling."
Citations: 8
The Box–Cox Transformation: Review and Extensions
IF 5.7, CAS Tier 1 (Mathematics), Q1 STATISTICS & PROBABILITY. Pub Date: 2021-05-01. DOI: 10.1214/20-STS778
A. Atkinson, M. Riani, A. Corbellini
The Box–Cox power transformation family for non-negative responses in linear models has a long and interesting history in both statistical practice and theory, which we summarize. The relationship between generalized linear models and log-transformed data is illustrated. Extensions investigated include the transform-both-sides model and the Yeo–Johnson transformation for observations that can be positive or negative. The paper also describes an extended Yeo–Johnson transformation that allows positive and negative responses to have different power transformations. Analyses of data show this to be necessary. Robustness enters through the fan plot, for which the forward search provides an ordering of the data. Plausible transformations are checked with an extended fan plot. These procedures are used to compare parametric power transformations with nonparametric transformations produced by smoothing.
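As a concrete reminder of the family under review, the following sketch evaluates the Box–Cox profile log-likelihood on a grid and recovers a transformation parameter near zero (the log transform) for simulated lognormal data. The data, grid, and normal-theory likelihood form are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def boxcox(y, lam):
    # Box-Cox transform for positive responses; the log transform when lam == 0.
    return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

def profile_loglik(y, lam):
    # Normal-theory profile log-likelihood of the transformed data,
    # including the Jacobian term (lam - 1) * sum(log y).
    z = boxcox(y, lam)
    return -0.5 * y.size * np.log(z.var()) + (lam - 1.0) * np.log(y).sum()

rng = np.random.default_rng(1)
y = np.exp(rng.normal(0.0, 0.5, size=2000))   # lognormal: log is the right transform
grid = np.linspace(-1.0, 1.0, 41)
lam_hat = grid[np.argmax([profile_loglik(y, lam) for lam in grid])]
```

Maximizing this profile likelihood over a grid of powers is the standard way the transformation parameter is chosen in practice.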
Citations: 44
A selective overview of deep learning.
IF 5.7, CAS Tier 1 (Mathematics), Q1 STATISTICS & PROBABILITY. Pub Date: 2021-05-01 (Epub 2020-04-19). DOI: 10.1214/20-sts783
Jianqing Fan, Cong Ma, Yiqiao Zhong

Deep learning has achieved tremendous success in recent years. In simple words, deep learning uses the composition of many nonlinear functions to model the complex dependency between input features and labels. While neural networks have a long history, recent advances have greatly improved their performance in computer vision, natural language processing, etc. From the statistical and scientific perspective, it is natural to ask: What is deep learning? What are the new characteristics of deep learning, compared with classical methods? What are the theoretical foundations of deep learning? To answer these questions, we introduce common neural network models (e.g., convolutional neural nets, recurrent neural nets, generative adversarial nets) and training techniques (e.g., stochastic gradient descent, dropout, batch normalization) from a statistical point of view. Along the way, we highlight new characteristics of deep learning (including depth and over-parametrization) and explain their practical and theoretical benefits. We also sample recent results on theories of deep learning, many of which are only suggestive. While a complete understanding of deep learning remains elusive, we hope that our perspectives and discussions serve as a stimulus for new statistical research.
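The "composition of many nonlinear functions" and the stochastic gradient descent training the authors survey can be illustrated with a minimal one-hidden-layer network in plain NumPy. The architecture, data, and learning rate below are toy assumptions for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((256, 2))
y = np.tanh(X @ np.array([1.0, -2.0]))        # nonlinear target function

W1 = rng.standard_normal((2, 8)) * 0.5        # hidden-layer weights
w2 = rng.standard_normal(8) * 0.5             # output weights
lr = 0.05

def forward(Xb):
    H = np.tanh(Xb @ W1)                      # nonlinear hidden layer
    return H, H @ w2                          # linear output

mse_before = np.mean((forward(X)[1] - y) ** 2)
for step in range(2000):                      # minibatch SGD
    idx = rng.integers(0, 256, size=32)
    Xb, yb = X[idx], y[idx]
    H, pred = forward(Xb)
    err = pred - yb
    grad_w2 = H.T @ err / 32                  # backprop through the output layer
    grad_H = np.outer(err, w2) * (1 - H**2)   # backprop through tanh
    grad_W1 = Xb.T @ grad_H / 32
    w2 -= lr * grad_w2
    W1 -= lr * grad_W1
mse_after = np.mean((forward(X)[1] - y) ** 2)
```

Each update uses an unbiased gradient estimate from a random minibatch, which is what makes the procedure "stochastic" gradient descent.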

Citations: 0
Robust high dimensional factor models with applications to statistical machine learning.
IF 3.9, CAS Tier 1 (Mathematics), Q1 STATISTICS & PROBABILITY. Pub Date: 2021-05-01 (Epub 2021-04-19). DOI: 10.1214/20-sts785
Jianqing Fan, Kaizheng Wang, Yiqiao Zhong, Ziwei Zhu

Factor models are a class of powerful statistical models that have been widely used to deal with dependent measurements arising frequently in applications ranging from genomics and neuroscience to economics and finance. As data are collected at an ever-growing scale, statistical machine learning faces some new challenges: high dimensionality, strong dependence among observed variables, heavy-tailed variables and heterogeneity. High-dimensional robust factor analysis serves as a powerful toolkit to conquer these challenges. This paper gives a selective overview of recent advances in high-dimensional factor models and their applications to statistics, including Factor-Adjusted Robust Model selection (FarmSelect) and Factor-Adjusted Robust Multiple testing (FarmTest). We show that classical methods, especially principal component analysis (PCA), can be tailored to many new problems and provide powerful tools for statistical estimation and inference. We highlight PCA and its connections to matrix perturbation theory, robust statistics, random projection, false discovery rate, etc., and illustrate through several applications how insights from these fields yield solutions to modern challenges. We also present far-reaching connections between factor models and popular statistical learning problems, including network analysis and low-rank matrix recovery.
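The central role of PCA in factor analysis can be sketched on a simulated one-factor model: the leading principal component of the observed data recovers the latent factor up to sign and scale. The dimensions and signal strength below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 400, 100
f = rng.standard_normal(n)                                # latent factor
loadings = rng.standard_normal(p)                         # factor loadings
X = np.outer(f, loadings) + rng.standard_normal((n, p))   # one-factor model + noise

# PCA via the SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
f_hat = U[:, 0] * S[0]                                    # leading PC scores

# Alignment with the true factor (sign is not identified).
corr = abs(np.corrcoef(f, f_hat)[0, 1])
```

With a pervasive factor (loadings of order one across all p coordinates), the top sample eigenvector is well separated from the noise spectrum, which is the regime where the PCA-based methods surveyed here work.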

Citations: 0
Comment: On the History and Limitations of Probability Updating
IF 5.7, CAS Tier 1 (Mathematics), Q1 STATISTICS & PROBABILITY. Pub Date: 2021-04-01. DOI: 10.1214/21-STS765A
G. Shafer
Gong and Meng show that we can gain insights into classical paradoxes about conditional probability by acknowledging that apparently precise probabilities live within a larger world of imprecise probability. They also show that the notion of updating becomes problematic in this larger world. A closer look at the historical development of the notion of updating can give us further insights into its limitations.
Citations: 0
Comment: Moving Beyond Sets of Probabilities
IF 5.7, CAS Tier 1 (Mathematics), Q1 STATISTICS & PROBABILITY. Pub Date: 2021-04-01. DOI: 10.1214/21-STS765C
G. Wheeler
The theory of lower previsions is designed around the principles of coherence and sure-loss avoidance, thus steers clear of all the updating anomalies highlighted in Gong and Meng’s “Judicious Judgment Meets Unsettling Updating: Dilation, Sure Loss and Simpson’s Paradox” except dilation. In fact, the traditional problem with the theory of imprecise probability is that coherent inference is too complicated rather than unsettling. Progress has been made simplifying coherent inference by demoting sets of probabilities from fundamental building blocks to secondary representations that are derived or discarded as needed.
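Dilation, the one anomaly the abstract says lower previsions do not escape, can be seen numerically in the classic two-coin example (a standard illustration, not Gong and Meng's exact setup): X1 is a fair coin, and X2 agrees with X1 with unknown probability 1 − θ. Marginally P(X2 = H) = 1/2 for every θ, yet conditioning on X1 = H dilates the probability interval for {X2 = H} to [0, 1].

```python
import numpy as np

thetas = np.linspace(0.0, 1.0, 101)             # unknown dependence parameter

# P(X2 = H) = P(X1 = H) * (1 - theta) + P(X1 = T) * theta = 1/2 for every theta.
p_marg = 0.5 * (1.0 - thetas) + 0.5 * thetas
# P(X2 = H | X1 = H) = 1 - theta: depends entirely on the unknown theta.
p_cond = 1.0 - thetas

prior_interval = (p_marg.min(), p_marg.max())   # pinned at [1/2, 1/2]
post_interval = (p_cond.min(), p_cond.max())    # dilates to [0, 1]
```

Observing X1 is guaranteed to make the analyst's probability interval for X2 strictly wider, no matter which outcome is observed, which is exactly the dilation phenomenon discussed in this exchange.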
Citations: 0
Stochastic Approximation: From Statistical Origin to Big-Data, Multidisciplinary Applications
IF 5.7, CAS Tier 1 (Mathematics), Q1 STATISTICS & PROBABILITY. Pub Date: 2021-04-01. DOI: 10.1214/20-STS784
T. Lai, Hongsong Yuan
Stochastic approximation was introduced in 1951 to provide a new theoretical framework for root finding and optimization of a regression function in the then-nascent field of statistics. This review shows how it has evolved in response to other developments in statistics, notably time series and sequential analysis, and to applications in artificial intelligence, economics, and engineering. Its resurgence in the Big Data Era has led to new advances in both theory and applications of this microcosm of statistics and data science.
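The original 1951 Robbins–Monro scheme the review starts from can be sketched in a few lines: to find the root of a regression function m(θ) from noisy evaluations, step against each observation with decreasing gains a_n = 1/n. The target function m(θ) = θ − 2 and the noise distribution below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.0                                        # starting point
for n in range(1, 5001):
    # Unbiased noisy observation of m(theta) = theta - 2, whose root is 2.
    noisy = (theta - 2.0) + rng.standard_normal()
    theta -= (1.0 / n) * noisy                     # Robbins-Monro step, a_n = 1/n
```

The gain sequence satisfies the classical conditions (sum of a_n diverges, sum of a_n^2 converges), so the iterates converge to the root almost surely; the same recursion underlies stochastic gradient descent when the noisy observation is a gradient.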
Citations: 3
Noncommutative Probability and Multiplicative Cascades
IF 5.7, CAS Tier 1 (Mathematics), Q1 STATISTICS & PROBABILITY. Pub Date: 2021-04-01. DOI: 10.1214/20-STS780
I. McKeague
Various aspects of standard model particle physics might be explained by a suitably rich algebra acting on itself, as suggested by Furey (2015). The present paper develops the asymptotics of large causal tree diagrams that combine freely independent elements in such an algebra. The Marčenko–Pastur law and Wigner’s semicircle law are shown to emerge as limits of normalized sum-over-paths of nonnegative elements assigned to the edges of causal trees. These results are established in the setting of noncommutative probability. Trees with classically independent positive edge weights (random multiplicative cascades) were originally proposed by Mandelbrot as a model displaying the fractal features of turbulence. The novelty of the present work is the use of noncommutative (free) probability to allow the edge weights to take values in an algebra. An application to theoretical neuroscience is also discussed.
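Wigner's semicircle law, which the abstract shows emerging as a limit in this framework, is easy to see in its classical random-matrix form: the eigenvalues of a suitably normalized random symmetric matrix concentrate on [−2, 2]. The matrix size and Gaussian entries below are standard choices made here only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 600
A = rng.standard_normal((N, N))
W = (A + A.T) / np.sqrt(2.0 * N)       # symmetric, off-diagonal variance 1/N
eigs = np.linalg.eigvalsh(W)           # spectrum approximates the semicircle on [-2, 2]
```

A histogram of `eigs` approximates the semicircle density (1 / 2π)·sqrt(4 − x²); the spectral edges sit near ±2 with fluctuations that vanish as N grows.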
Citations: 0
Comment: On Focusing, Soft and Strong Revision of Choquet Capacities and Their Role in Statistics
IF 5.7, CAS Tier 1 (Mathematics), Q1 STATISTICS & PROBABILITY. Pub Date: 2021-04-01. DOI: 10.1214/21-STS765D
Thomas Augustin, G. Schollmeyer
We congratulate Ruobin Gong and Xiao-Li Meng on their thought-provoking paper demonstrating the power of imprecise probabilities in statistics. In particular, Gong and Meng clarify important statistical paradoxes by discussing them in the framework of generalized uncertainty quantification and different conditioning rules used for updating. In this note, we characterize all three conditioning rules as envelopes of certain sets of conditional probabilities. This view also suggests some generalizations that can be seen as compromise rules. Similar to Gong and Meng, our derivations mainly focus on Choquet capacities of order 2, and so we also briefly discuss in general their role as statistical models. We conclude with some general remarks on the potential of imprecise probabilities to cope with the multidimensional nature of uncertainty.
Citations: 3