
Journal of Econometrics: Latest Publications

Huber Principal Component Analysis for large-dimensional factor models
IF 9.9 CAS Tier 3 (Economics) Q1 ECONOMICS Pub Date: 2025-03-18 DOI: 10.1016/j.jeconom.2025.105993
Yong He , Lingxiao Li , Dong Liu , Wen-Xin Zhou
Factor models have been widely used in economics and finance. However, the heavy-tailed nature of macroeconomic and financial data is often neglected in statistical analysis. To address this issue, we propose a robust approach to estimate factor loadings and scores by minimizing the Huber loss function, which is motivated by the equivalence between conventional Principal Component Analysis (PCA) and the constrained least squares method in the factor model. We provide two algorithms that use different penalty forms. The first algorithm involves an element-wise-type Huber loss minimization, solved by an iterative Huber regression algorithm. The second algorithm, which we refer to as Huber PCA, minimizes the ℓ2-norm-type Huber loss and performs PCA on the weighted sample covariance matrix. We examine the theoretical minimizer of the element-wise Huber loss function and demonstrate that it has the same convergence rate as conventional PCA when the idiosyncratic errors have bounded second moments. We also derive their asymptotic distributions under mild conditions. Moreover, we suggest a consistent model selection criterion that relies on rank minimization to estimate the number of factors robustly. We showcase the benefits of the proposed two algorithms through extensive numerical experiments and a real macroeconomic data example. An R package named “HDRFA” has been developed to conduct the proposed robust factor analysis.
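To make the alternating scheme concrete, here is a minimal NumPy sketch of iteratively reweighted least squares (IRLS) for the element-wise Huber loss, with an ordinary-PCA initialization. The tuning constant c = 1.345 and the update order are illustrative defaults, not taken from the paper or the HDRFA package:

```python
import numpy as np

def huber_weight(resid, c=1.345):
    """IRLS weights psi(r)/r for the Huber loss: 1 inside [-c, c], c/|r| outside."""
    a = np.abs(resid)
    w = np.ones_like(a)
    big = a > c
    w[big] = c / a[big]
    return w

def huber_pca(X, r=2, n_iter=50, c=1.345):
    """Alternating weighted least squares for min sum_it Huber(X_it - l_i' f_t).
    X is a T x N panel; returns factors F (T x r) and loadings L (N x r)."""
    T, N = X.shape
    U, s, Vt = np.linalg.svd(X, full_matrices=False)   # ordinary-PCA start
    F = U[:, :r] * np.sqrt(T)
    L = Vt[:r].T * s[:r] / np.sqrt(T)
    for _ in range(n_iter):
        W = huber_weight(X - F @ L.T, c)               # downweight outlying residuals
        for i in range(N):                             # Huber regression per series
            w = W[:, i]
            L[i] = np.linalg.solve(F.T @ (F * w[:, None]), F.T @ (w * X[:, i]))
        for t in range(T):                             # Huber regression per period
            w = W[t]
            F[t] = np.linalg.solve(L.T @ (L * w[:, None]), L.T @ (w * X[t]))
    return F, L
```

With light-tailed data all weights equal one and the iteration reduces to alternating least squares, i.e. conventional PCA, matching the least-squares equivalence the abstract appeals to.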
Citations: 0
Limit theory and inference in non-cointegrated functional coefficient regression
IF 9.9 CAS Tier 3 (Economics) Q1 ECONOMICS Pub Date: 2025-03-17 DOI: 10.1016/j.jeconom.2025.105996
Ying Wang , Peter C.B. Phillips , Yundong Tu
Functional coefficient (FC) cointegrating regressions offer empirical investigators flexibility in modeling economic relationships by introducing covariates that influence the direction and intensity of comovement among nonstationary time series. FC regression models are also useful when formal cointegration is absent, in the sense that the equation errors may themselves be nonstationary, but where the nonstationary series display well-defined FC linkages that can be meaningfully interpreted as correlation measures involving the covariates. The present paper proposes new nonparametric estimators for such FC regression models where the nonstationary series display linkages that enable consistent estimation of the correlation measures between them. Specifically, we develop √n-consistent estimators for the functional coefficient and establish their asymptotic distributions, which involve mixed normal limits that facilitate inference. Two novel features that appear in the limit theory are (i) the need for non-diagonal matrix normalization due to the presence of stationary and nonstationary components in the regression; and (ii) random bias elements that appear in the asymptotic distribution of the kernel estimators, again resulting from the nonstationary regression components. Numerical studies reveal that the proposed estimators achieve significant efficiency improvements compared to the estimators suggested in earlier work by Sun et al. (2011). Easily implementable specification tests with standard chi-square asymptotics are suggested to check for constancy of the functional coefficient. These tests are shown to have a faster divergence rate under local alternatives and to enjoy superior performance in simulations relative to the tests proposed in Gan et al. (2014). An empirical application based on the quantity theory of money is included, illustrating the practical use of correlated but non-cointegrated regression relations.
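The kernel-weighted least squares idea underlying such functional coefficient estimators can be sketched generically. Below is a hypothetical local-level (Nadaraya–Watson-type) estimator of β(z0) in y_t = β(z_t)'x_t + u_t with a Gaussian kernel; the paper's actual estimators, their non-diagonal normalization, and bias corrections are not reproduced here:

```python
import numpy as np

def fc_beta(y, X, z, z0, h):
    """Kernel-weighted least squares estimate of the functional
    coefficient beta(z0) in y = beta(z)'x + u (Gaussian kernel, bandwidth h)."""
    K = np.exp(-0.5 * ((z - z0) / h) ** 2)  # kernel weights centered at z0
    XtK = X.T * K                           # weight each observation's regressors
    return np.linalg.solve(XtK @ X, XtK @ y)
```

Evaluating `fc_beta` over a grid of `z0` values traces out the whole coefficient function, which is how constancy of β(·) would be inspected informally before any formal specification test.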
Citations: 0
Adjustments with many regressors under covariate-adaptive randomizations
IF 9.9 CAS Tier 3 (Economics) Q1 ECONOMICS Pub Date: 2025-03-15 DOI: 10.1016/j.jeconom.2025.105991
Liang Jiang , Liyao Li , Ke Miao , Yichong Zhang
Our paper discovers a new trade-off of using regression adjustments (RAs) in causal inference under covariate-adaptive randomizations (CARs). On one hand, RAs can improve the efficiency of causal estimators by incorporating information from covariates that are not used in the randomization. On the other hand, RAs can degrade estimation efficiency due to their estimation errors, which are not asymptotically negligible when the number of regressors is of the same order as the sample size. Ignoring the estimation errors of RAs may result in serious over-rejection of causal inference under the null hypothesis. To address the issue, we construct a new ATE estimator by optimally linearly combining the estimators with and without RAs. We then develop a unified inference theory for this estimator under CARs. It has two features: (1) the Wald test based on it achieves the exact asymptotic size under the null hypothesis, regardless of whether the number of covariates is fixed or diverges no faster than the sample size; and (2) it guarantees weak efficiency improvement over estimators both with and without RAs.
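The combination step can be illustrated in a toy experiment: estimate the ATE with and without regression adjustment, estimate the 2x2 covariance of the two estimators (here by a naive nonparametric bootstrap, which stands in for the paper's CAR-specific variance theory), and take the variance-minimizing linear combination:

```python
import numpy as np

def ate_unadj(y, d):
    # simple difference in means between treated (d=1) and control (d=0)
    return y[d == 1].mean() - y[d == 0].mean()

def ate_ra(y, d, X):
    # regression-adjusted estimator: OLS of y on (1, d, X), coefficient on d
    Z = np.column_stack([np.ones_like(y), d, X])
    return np.linalg.lstsq(Z, y, rcond=None)[0][1]

def ate_combined(y, d, X, n_boot=200, seed=0):
    """Variance-minimizing linear combination of the two ATE estimators,
    with the covariance matrix estimated by a naive bootstrap (illustrative)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    draws = np.empty((n_boot, 2))
    for b in range(n_boot):
        idx = rng.integers(0, n, n)
        draws[b] = [ate_unadj(y[idx], d[idx]), ate_ra(y[idx], d[idx], X[idx])]
    V = np.cov(draws.T)                      # 2x2 covariance of the estimators
    # weight on the unadjusted estimator minimizing Var(w*t1 + (1-w)*t2)
    w = (V[1, 1] - V[0, 1]) / (V[0, 0] + V[1, 1] - 2 * V[0, 1])
    return w * ate_unadj(y, d) + (1 - w) * ate_ra(y, d, X)
```

The closed-form weight is the standard minimum-variance combination of two estimators of the same parameter; the paper's contribution is showing how to make this operational and size-correct under covariate-adaptive randomization with many regressors.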
Citations: 0
Estimation and uniform inference in sparse high-dimensional additive models
IF 9.9 CAS Tier 3 (Economics) Q1 ECONOMICS Pub Date: 2025-03-05 DOI: 10.1016/j.jeconom.2025.105973
Philipp Bach , Sven Klaassen , Jannis Kueck , Martin Spindler
We develop a novel method to construct uniformly valid confidence bands for a nonparametric component f1 in the sparse additive model Y = f1(X1) + … + fp(Xp) + ɛ in a high-dimensional setting. Our method integrates sieve estimation into a high-dimensional Z-estimation framework, facilitating the construction of uniformly valid confidence bands for the target component f1. To form these confidence bands, we employ a multiplier bootstrap procedure. Additionally, we provide rates for the uniform lasso estimation in high dimensions, which may be of independent interest. Through simulation studies, we demonstrate that our proposed method delivers reliable results in terms of estimation and coverage, even in small samples.
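The sieve-into-penalization step can be sketched as follows: expand each component in a small polynomial sieve basis, stack the bases, and fit the lasso by proximal gradient descent (ISTA). The basis, penalty level, and solver below are illustrative choices; the paper's Z-estimation framework and multiplier bootstrap bands are not reproduced:

```python
import numpy as np

def sieve_basis(x, degree=3):
    # centered polynomial sieve basis for one additive component
    return np.column_stack([x ** k - np.mean(x ** k) for k in range(1, degree + 1)])

def lasso_ista(Z, y, lam, n_iter=500):
    """Proximal gradient (ISTA) for 0.5/n * ||y - Z b||^2 + lam * ||b||_1."""
    n, p = Z.shape
    step = n / np.linalg.norm(Z, 2) ** 2      # 1 / Lipschitz constant of gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        g = Z.T @ (Z @ b - y) / n
        b = b - step * g
        b = np.sign(b) * np.maximum(np.abs(b) - step * lam, 0.0)  # soft-threshold
    return b

def additive_fit(X, y, degree=3, lam=0.1):
    # stack the sieve bases of all p components and fit one lasso
    bases = [sieve_basis(X[:, j], degree) for j in range(X.shape[1])]
    Z = np.column_stack(bases)
    b = lasso_ista(Z, y - y.mean(), lam)
    return Z, b
```

Sparsity of the additive model translates into block sparsity of `b`: the basis coefficients of inactive components are soft-thresholded to exactly zero.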
Citations: 0
Bootstrap based asymptotic refinements for high-dimensional nonlinear models
IF 9.9 CAS Tier 3 (Economics) Q1 ECONOMICS Pub Date: 2025-03-03 DOI: 10.1016/j.jeconom.2025.105977
Joel L. Horowitz , Ahnaf Rafi
We consider penalized extremum estimation of a high-dimensional, possibly nonlinear model that is sparse in the sense that most of its parameters are zero but some are not. We use the SCAD penalty function, which provides model selection consistent and oracle efficient estimates under suitable conditions. However, asymptotic approximations based on the oracle model can be inaccurate with the sample sizes found in many applications. This paper gives conditions under which the bootstrap, based on estimates obtained through SCAD penalization with thresholding, provides asymptotic refinements of size O(n⁻²) for the error in the rejection (coverage) probability of a symmetric hypothesis test (confidence interval) and O(n⁻¹) for the error in the rejection (coverage) probability of a one-sided or equal tailed test (confidence interval). The results of Monte Carlo experiments show that the bootstrap can provide large reductions in errors in rejection and coverage probabilities. The bootstrap is consistent, though it does not necessarily provide asymptotic refinements, if some parameters are close but not equal to zero. Random-coefficients logit and probit models and nonlinear moment models are examples of models to which the procedure applies.
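The SCAD penalty itself is piecewise and easy to get wrong, so a small sketch may help. The values follow Fan and Li's standard definition with the conventional a = 3.7; the paper's thresholded SCAD estimation and bootstrap steps are not reproduced here:

```python
import numpy as np

def scad_penalty(beta, lam, a=3.7):
    """Fan-Li SCAD penalty, evaluated element-wise:
    lam*|b|                                  for |b| <= lam,
    (2*a*lam*|b| - b^2 - lam^2)/(2*(a-1))    for lam < |b| <= a*lam,
    (a+1)*lam^2/2                            for |b| > a*lam."""
    b = np.abs(np.asarray(beta, dtype=float))
    p = np.empty_like(b)
    small = b <= lam
    mid = (b > lam) & (b <= a * lam)
    big = b > a * lam
    p[small] = lam * b[small]
    p[mid] = (2 * a * lam * b[mid] - b[mid] ** 2 - lam ** 2) / (2 * (a - 1))
    p[big] = (a + 1) * lam ** 2 / 2
    return p
```

Unlike the lasso's linear penalty, SCAD flattens out beyond a·λ, which is what leaves large coefficients unbiased and yields the oracle property the abstract invokes.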
Citations: 0
Score-type tests for normal mixtures
IF 9.9 CAS Tier 3 (Economics) Q1 ECONOMICS Pub Date: 2025-03-01 DOI: 10.1016/j.jeconom.2024.105717
Dante Amengual , Xinyue Bei , Marine Carrasco , Enrique Sentana
Testing normality against discrete normal mixtures is complex because some parameters turn increasingly underidentified along alternative ways of approaching the null, others are inequality constrained, and several higher-order derivatives become identically 0. These problems make the maximum of the alternative model log-likelihood function numerically unreliable. We propose score-type tests asymptotically equivalent to the likelihood ratio as the largest of two simple intuitive statistics that only require estimation under the null. One novelty of our approach is that we treat symmetrically both ways of writing the null hypothesis without excluding any region of the parameter space. We derive the asymptotic distribution of our tests under the null and sequences of local alternatives. We also show that their asymptotic distribution is the same whether applied to observations or standardized residuals from heteroskedastic regression models. Finally, we study their power in simulations and apply them to the residuals of Mincer earnings functions.
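The "largest of two statistics" construction can be illustrated generically: compute two moment-based score statistics under the null of normality, take their maximum, and calibrate the critical value by Monte Carlo simulation. The skewness and excess-kurtosis scores below are standard normality diagnostics chosen purely for illustration; they are not the paper's statistics, and the simulated null stands in for its asymptotic distribution theory:

```python
import numpy as np

def two_scores(x):
    """Two simple moment-based scores against normality (illustrative)."""
    z = (x - x.mean()) / x.std()
    n = len(x)
    s1 = n * np.mean(z ** 3) ** 2 / 6          # skewness score
    s2 = n * (np.mean(z ** 4) - 3) ** 2 / 24   # excess-kurtosis score
    return s1, s2

def max_score_test(x, n_sim=2000, seed=0):
    """Test statistic = max of the two scores; p-value by simulating the
    null distribution of the max from standard normal samples."""
    rng = np.random.default_rng(seed)
    stat = max(two_scores(x))
    null = np.array([max(two_scores(rng.standard_normal(len(x))))
                     for _ in range(n_sim)])
    return stat, np.mean(null >= stat)
```

Taking the maximum of two simple statistics and simulating its null distribution sidesteps the non-standard limit that a naive likelihood ratio would face, which is the spirit of only requiring estimation under the null.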
Citations: 0
The chained difference-in-differences
IF 9.9 CAS Tier 3 (Economics) Q1 ECONOMICS Pub Date: 2025-03-01 DOI: 10.1016/j.jeconom.2024.105783
Christophe Bellégo , David Benatia , Vincent Dortet-Bernadet
This paper studies the identification, estimation, and inference of long-term (binary) treatment effect parameters when balanced panel data is not available, or consists of only a subset of the available data. We develop a new estimator: the chained difference-in-differences, which leverages the overlapping structure of many unbalanced panel data sets. This approach consists in aggregating a collection of short-term treatment effects estimated on multiple incomplete panels. Our estimator accommodates (1) multiple time periods, (2) variation in treatment timing, (3) treatment effect heterogeneity, (4) general missing data patterns, and (5) sample selection on observables. We establish the asymptotic properties of the proposed estimator and discuss identification and efficiency gains in comparison to existing methods. Finally, we illustrate its relevance through (i) numerical simulations, and (ii) an application about the effects of an innovation policy in France.
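The chaining idea, aggregating short-term 2x2 DiD estimates from consecutive and possibly different overlapping subsamples and then cumulating them, can be sketched in a toy form. This balanced-within-transition sketch is only the aggregation skeleton; the paper's estimator additionally handles treatment-timing variation, heterogeneity, and sample selection:

```python
import numpy as np

def did_2x2(y0, y1, d):
    """One transition: mean outcome change for treated minus control."""
    return (y1[d == 1] - y0[d == 1]).mean() - (y1[d == 0] - y0[d == 0]).mean()

def chained_did(panels):
    """panels: list of (y_t, y_t_plus_1, d) tuples, one per consecutive
    transition, each possibly computed on a different overlapping subsample.
    Returns the cumulative (long-term) effect path."""
    short = [did_2x2(y0, y1, d) for (y0, y1, d) in panels]
    return np.cumsum(short)
```

Because each short-term effect only needs units observed in two adjacent periods, the chain can be built from an unbalanced panel where no unit is observed over the full horizon.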
Citations: 0
The term structure of macroeconomic risks at the effective lower bound
IF 9.9 CAS Tier 3 (Economics) Q1 ECONOMICS Pub Date: 2025-03-01 DOI: 10.1016/j.jeconom.2023.01.005
Guillaume Roussellet
This paper proposes a new macro-finance model that solves the tension between tractability, flexibility in macroeconomic dynamics, and consistency of the term structures of treasury yields with the effective lower bound (ELB). I use the term structures of U.S. nominal and real treasury yields from 1990 to explore the interdependence between inflation expectations, volatility, and monetary policy at the ELB. The estimation reveals that real yields stay elevated during the ELB due to large premia and deflation fears, produced by a persistent shift in inflation dynamics, with low average inflation and heightened inflation volatility.
Citations: 0
Regularizing stock return covariance matrices via multiple testing of correlations
IF 9.9 CAS Tier 3 (Economics) Q1 ECONOMICS Pub Date: 2025-03-01 DOI: 10.1016/j.jeconom.2024.105753
Richard Luger
This paper develops a large-scale inference approach for the regularization of stock return covariance matrices. The framework allows for the presence of heavy tails and multivariate GARCH-type effects of unknown form among the stock returns. The approach involves simultaneous testing of all pairwise correlations, followed by setting non-statistically significant elements to zero. This adaptive thresholding is achieved through sign-based Monte Carlo resampling within multiple testing procedures, controlling either the traditional familywise error rate, a generalized familywise error rate, or the false discovery proportion. Subsequent shrinkage ensures that the final covariance matrix estimate is positive definite and well-conditioned while preserving the achieved sparsity. Compared to alternative estimators, this new regularization method demonstrates strong performance in simulation experiments and real portfolio optimization.
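The pipeline — test all pairwise correlations, zero out the insignificant ones, then shrink until the matrix is positive definite — can be sketched with a Bonferroni threshold on Fisher z statistics. Bonferroni stands in for the paper's sign-based Monte Carlo multiple-testing procedures, and linear shrinkage toward the diagonal stands in for its conditioning step:

```python
import numpy as np
from statistics import NormalDist

def regularize_cov(X, alpha=0.05, shrink_step=0.05):
    """Adaptive thresholding of the sample correlation matrix via
    Bonferroni-corrected Fisher z tests, then shrinkage to the identity
    (in correlation space) until positive definite."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    sd = X.std(axis=0, ddof=1)
    m = p * (p - 1) // 2                                 # number of pairwise tests
    zcrit = NormalDist().inv_cdf(1 - alpha / (2 * m))    # Bonferroni cutoff
    z = np.sqrt(n - 3) * np.arctanh(np.clip(R, -0.999, 0.999))  # Fisher z
    keep = np.abs(z) >= zcrit
    np.fill_diagonal(keep, True)
    R_thr = np.where(keep, R, 0.0)                       # zero insignificant pairs
    lam = 0.0                                            # shrink until well-posed
    while np.linalg.eigvalsh((1 - lam) * R_thr + lam * np.eye(p)).min() <= 1e-8:
        lam += shrink_step
    R_reg = (1 - lam) * R_thr + lam * np.eye(p)
    return R_reg * np.outer(sd, sd)                      # back to covariance scale
```

Thresholding in correlation space and rescaling by the standard deviations keeps the achieved sparsity pattern while the shrinkage guarantees the estimate is usable in downstream portfolio optimization.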
Citations: 0
Identification, inference and risk
IF 9.9 CAS Tier 3 (Economics) Q1 ECONOMICS Pub Date: 2025-03-01 DOI: 10.1016/j.jeconom.2024.105938
Bertille Antoine, Patrick Gagliardini, René Garcia, Enrique Sentana
Citations: 0