Pub Date: 2023-09-11, DOI: 10.1017/s0266466623000269
VALID HETEROSKEDASTICITY ROBUST TESTING
Benedikt M. Pötscher, David Preinerstorfer
Tests based on heteroskedasticity robust standard errors are an important technique in econometric practice. Choosing the right critical value, however, is not simple at all: conventional critical values based on asymptotics often lead to severe size distortions, and so do existing adjustments, including the bootstrap. To avoid these issues, we suggest using smallest size-controlling critical values, the generic existence of which we prove in this article for the commonly used test statistics. Furthermore, we give easy-to-check sufficient, and often also necessary, conditions for their existence. Provided they exist, these critical values are the canonical choice: larger critical values result in unnecessary power loss, whereas smaller critical values lead to overrejections under the null hypothesis, make spurious discoveries more likely, and thus are invalid. We suggest algorithms to numerically determine the proposed critical values and provide implementations in accompanying software. Finally, we numerically study the behavior of the proposed testing procedures, including their power properties.
{"title":"VALID HETEROSKEDASTICITY ROBUST TESTING","authors":"Benedikt M. Pötscher, David Preinerstorfer","doi":"10.1017/s0266466623000269","DOIUrl":"https://doi.org/10.1017/s0266466623000269","url":null,"abstract":"Tests based on heteroskedasticity robust standard errors are an important technique in econometric practice. Choosing the right critical value, however, is not simple at all: conventional critical values based on asymptotics often lead to severe size distortions, and so do existing adjustments including the bootstrap. To avoid these issues, we suggest to use smallest size-controlling critical values, the generic existence of which we prove in this article for the commonly used test statistics. Furthermore, sufficient and often also necessary conditions for their existence are given that are easy to check. Granted their existence, these critical values are the canonical choice: larger critical values result in unnecessary power loss, whereas smaller critical values lead to overrejections under the null hypothesis, make spurious discoveries more likely, and thus are invalid. We suggest algorithms to numerically determine the proposed critical values and provide implementations in accompanying software. Finally, we numerically study the behavior of the proposed testing procedures, including their power properties.","PeriodicalId":49275,"journal":{"name":"Econometric Theory","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135938444","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-09-05, DOI: 10.1017/s0266466623000270
INTERACTIVE EFFECTS PANEL DATA MODELS WITH GENERAL FACTORS AND REGRESSORS
Bin Peng, Liangjun Su, Joakim Westerlund, Yanrong Yang
This paper considers a model with general regressors and unobservable common factors. An estimator based on iterated principal component analysis is proposed and shown to be not only asymptotically normal but, under certain conditions, also free of the asymptotic incidental-parameter bias that is otherwise so common. Interestingly, the conditions required to achieve unbiasedness become weaker the stronger the trends in the factors are, and if the trending is strong enough, unbiasedness comes at no cost at all. The approach does not require any knowledge of how many factors there are, or of whether they are deterministic or stochastic. The order of integration of the factors is also treated as unknown, as is the order of integration of the regressors, which means that there is no need to pre-test for unit roots or to decide on which deterministic terms to include in the model.
{"title":"INTERACTIVE EFFECTS PANEL DATA MODELS WITH GENERAL FACTORS AND REGRESSORS","authors":"Bin Peng, Liangjun Su, Joakim Westerlund, Yanrong Yang","doi":"10.1017/s0266466623000270","DOIUrl":"https://doi.org/10.1017/s0266466623000270","url":null,"abstract":"This paper considers a model with general regressors and unobservable common factors. An estimator based on iterated principal component analysis is proposed, which is shown to be not only asymptotically normal, but under certain conditions also free of the otherwise so common asymptotic incidental parameters bias. Interestingly, the conditions required to achieve unbiasedness become weaker the stronger the trends in the factors, and if the trending is strong enough, unbiasedness comes at no cost at all. The approach does not require any knowledge of how many factors there are, or whether they are deterministic or stochastic. The order of integration of the factors is also treated as unknown, as is the order of integration of the regressors, which means that there is no need to pre-test for unit roots, or to decide on which deterministic terms to include in the model.","PeriodicalId":49275,"journal":{"name":"Econometric Theory","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135205444","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-08-02, DOI: 10.1017/s026646662300021x
IDENTIFICATION AND INFERENCE IN A QUANTILE REGRESSION DISCONTINUITY DESIGN UNDER RANK SIMILARITY WITH COVARIATES
Zequn Jin, Yu Zhang, Zhengyu Zhang, Yahong Zhou
This study investigates the identification and inference of quantile treatment effects (QTEs) in a fuzzy regression discontinuity (RD) design under rank similarity. Unlike Frandsen et al. (2012, Journal of Econometrics 168, 382–395), who focus on QTEs only for the compliant subpopulation, our approach can identify QTEs and the average treatment effect for the whole population at the threshold. We derive a new set of moment restrictions for the RD model by imposing a local rank similarity condition, which restricts the evolution of individual ranks across treatment status in a neighborhood around the threshold. Based on the moment restrictions, we derive closed-form solutions for the estimands of the potential outcome cumulative distribution functions for the whole population. We establish functional central limit theorems and bootstrap validity results for the QTE estimators, explicitly accounting for observed covariates. In particular, we develop a multiplier bootstrap-based inference method with robustness against large bandwidths that applies to uniform inference, extending the recent work of Chiang et al. (2019, Journal of Econometrics 211, 589–618). We also propose a test for the local rank similarity assumption. To illustrate the estimation approach and its properties, we provide a simulation study and estimate the impacts of India's 40-billion-dollar national rural road construction program on the reallocation of labor out of agriculture.
{"title":"IDENTIFICATION AND INFERENCE IN A QUANTILE REGRESSION DISCONTINUITY DESIGN UNDER RANK SIMILARITY WITH COVARIATES","authors":"Zequn Jin, Yu Zhang, Zhengyu Zhang, Yahong Zhou","doi":"10.1017/s026646662300021x","DOIUrl":"https://doi.org/10.1017/s026646662300021x","url":null,"abstract":"This study investigates the identification and inference of quantile treatment effects (QTEs) in a fuzzy regression discontinuity (RD) design under rank similarity . Unlike Frandsen et al. (2012, Journal of Econometrics 168, 382–395), who focus on QTEs only for the compliant subpopulation, our approach can identify QTEs and average treatment effect for the whole population at the threshold. We derived a new set of moment restrictions for the RD model by imposing a local rank similarity condition, which restricts the evolution of individual ranks across treatment status in a neighborhood around the threshold. Based on the moment restrictions, we derive closed-form solutions for the estimands of the potential outcome cumulative distribution functions for the whole population. We demonstrate the functional central limit theorems and bootstrap validity results for the QTE estimators by explicitly accounting for observed covariates. In particular, we develop a multiplier bootstrap-based inference method with robustness against large bandwidths that applies to uniform inference by extending the recent work of Chiang et al. (2019, Journal of Econometrics 211, 589–618). We also propose a test for the local rank similarity assumption. To illustrate the estimation approach and its properties, we provide a simulation study and estimate the impacts of India’s 40-billion-dollar national rural road construction program on the reallocation of labor out of agriculture.","PeriodicalId":49275,"journal":{"name":"Econometric Theory","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135015392","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-07-05, eCollection Date: 2023-06-01, DOI: 10.2478/jtim-2023-0088
A generalized deep learning model for heart failure diagnosis using dynamic and static ultrasound
Zeye Liu, Yuan Huang, Hang Li, Wenchao Li, Fengwen Zhang, Wenbin Ouyang, Shouzheng Wang, Zhiling Luo, Jinduo Wang, Yan Chen, Ruibing Xia, Yakun Li, Xiangbin Pan
Objective: Echocardiography (ECG) is the most common method used to diagnose heart failure (HF). However, its accuracy relies on the experience of the operator. Additionally, the video format of the data makes it difficult for patients to bring the recordings to referrals and reexaminations. Therefore, this study used a deep learning approach to assist physicians in assessing cardiac function, to promote the standardization of echocardiographic findings and the compatibility of dynamic and static ultrasound data.
Methods: A deep spatio-temporal convolutional model, r2plus1d-Pan (trained on dynamic data and applied to static data), was improved and trained using the idea of "regression training combined with classification application," which can be generalized to dynamic ECG and static cardiac ultrasound views to identify HF with a reduced ejection fraction (EF < 40%). Additionally, three independent datasets containing 8,976 cardiac ultrasound views and 10,085 cardiac ultrasound videos were established. Subsequently, a multinational, multi-center dataset of EF was labeled. Furthermore, model training and independent validation were performed. Finally, 15 registered ultrasonographers and cardiologists with varying years of experience in three regional hospitals specializing in cardiovascular disease were recruited to compare the results.
Results: The proposed deep spatio-temporal convolutional model achieved an area under the receiver operating characteristic curve (AUC) value of 0.95 (95% confidence interval [CI]: 0.947 to 0.953) on the training set of dynamic ultrasound data and an AUC of 1 (95% CI: 1 to 1) on the independent validation set. Subsequently, the model was applied to the static cardiac ultrasound view (validation set) with simultaneous input of 1, 2, 4, and 8 images of the same heart, with classification accuracies of 85%, 81%, 93%, and 92%, respectively. On the static data, the classification accuracy of the artificial intelligence (AI) model was comparable with the best performance of ultrasonographers and cardiologists with more than 3 years of experience (P = 0.344), but significantly better than the median level (P = 0.0000008).
Conclusion: A new deep spatio-temporal convolutional model was constructed to accurately identify patients with HF with reduced EF (< 40%) using dynamic and static cardiac ultrasound images. The model outperformed the diagnostic performance of most senior specialists. This may be the first HF-related AI diagnostic model compatible with multi-dimensional cardiac ultrasound data, and it may thereby contribute to improving HF diagnosis. Additionally, the model enables patients to carry "on-the-go" static ultrasound reports for referral and reexamination, thus saving healthcare resources.
{"title":"A generalized deep learning model for heart failure diagnosis using dynamic and static ultrasound.","authors":"Zeye Liu, Yuan Huang, Hang Li, Wenchao Li, Fengwen Zhang, Wenbin Ouyang, Shouzheng Wang, Zhiling Luo, Jinduo Wang, Yan Chen, Ruibing Xia, Yakun Li, Xiangbin Pan","doi":"10.2478/jtim-2023-0088","DOIUrl":"10.2478/jtim-2023-0088","url":null,"abstract":"<p><strong>Objective: </strong>Echocardiography (ECG) is the most common method used to diagnose heart failure (HF). However, its accuracy relies on the experience of the operator. Additionally, the video format of the data makes it challenging for patients to bring them to referrals and reexaminations. Therefore, this study used a deep learning approach to assist physicians in assessing cardiac function to promote the standardization of echocardiographic findings and compatibility of dynamic and static ultrasound data.</p><p><strong>Methods: </strong>A deep spatio-temporal convolutional model r2plus1d-Pan (trained on dynamic data and applied to static data) was improved and trained using the idea of \"regression training combined with classification application,\" which can be generalized to dynamic ECG and static cardiac ultrasound views to identify HF with a reduced ejection fraction (EF < 40%). Additionally, three independent datasets containing 8976 cardiac ultrasound views and 10085 cardiac ultrasound videos were established. Subsequently, a multinational, multi-center dataset of EF was labeled. Furthermore, model training and independent validation were performed. Finally, 15 registered ultrasonographers and cardiologists with different working years in three regional hospitals specialized in cardiovascular disease were recruited to compare the results.</p><p><strong>Results: </strong>The proposed deep spatio-temporal convolutional model achieved an area under the receiveroperating characteristic curve (AUC) value of 0.95 (95% confidence interval [CI]: 0.947 to 0.953) on the training set of dynamic ultrasound data and an AUC of 1 (95% CI, 1 to 1) on the independent validation set. Subsequently, the model was applied to the static cardiac ultrasound view (validation set) with simultaneous input of 1, 2, 4, and 8 images of the same heart, with classification accuracies of 85%, 81%, 93%, and 92%, respectively. On the static data, the classification accuracy of the artificial intelligence (AI) model was comparable with the best performance of ultrasonographers and cardiologists with more than 3 working years (P = 0.344), but significantly better than the median level (<i>P</i> = 0.0000008).</p><p><strong>Conclusion: </strong>A new deep spatio-temporal convolution model was constructed to identify patients with HF with reduced EF accurately (< 40%) using dynamic and static cardiac ultrasound images. The model outperformed the diagnostic performance of most senior specialists. This may be the first HF-related AI diagnostic model compatible with multi-dimensional cardiac ultrasound data, and may thereby contribute to the improvement of HF diagnosis. 
Additionally, the model enables patients to carry \"on-the-go\" static ultrasound reports for referral and reexamination, thus saving healthcare resources.</p>","PeriodicalId":49275,"journal":{"name":"Econometric Theory","volume":"36 1","pages":"138-144"},"PeriodicalIF":4.9,"publicationDate":"2023-07-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10680380/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78361499","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
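The "regression training combined with classification application" idea can be illustrated with the publicly available R(2+1)D backbone from torchvision: train the network to regress EF and, at deployment, flag HF with reduced EF whenever the predicted EF falls below 40%. This is only a generic PyTorch sketch of that idea, not the authors' modified r2plus1d-Pan model; all names, shapes, and hyperparameters are placeholders.

```python
import torch
import torch.nn as nn
from torchvision.models.video import r2plus1d_18

# Sketch of "regression training combined with classification application":
# the backbone is trained to regress EF, and HF with reduced EF is then
# declared whenever the predicted EF falls below 40%.
model = r2plus1d_18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 1)        # single EF output

def train_step(model, clips, ef_labels, optimizer):
    """clips: (B, 3, T, H, W) video tensor; ef_labels: (B,) EF values scaled to [0, 1]."""
    optimizer.zero_grad()
    pred_ef = model(clips).squeeze(-1)
    loss = nn.functional.mse_loss(pred_ef, ef_labels)  # regression objective
    loss.backward()
    optimizer.step()
    return loss.item()

@torch.no_grad()
def classify_hf(model, clips, threshold=0.40):
    """Classification application: HF with reduced EF if predicted EF < threshold."""
    return model(clips).squeeze(-1) < threshold
```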
Pub Date: 2023-06-01, DOI: 10.1017/s026646662300018x
SPECIFICATION TESTS FOR TIME-VARYING COEFFICIENT PANEL DATA MODELS
A. Atak, Thomas Tao, Yonghui Zhang, Qiankun Zhou
This paper provides nonparametric specification tests for the commonly used homogeneous and stable coefficient structures in panel data models. We first obtain the augmented residuals by estimating the model under the null hypothesis and then run auxiliary time series regressions of the augmented residuals on covariates with time-varying coefficients (TVCs) via sieve methods. The test statistic is then constructed by averaging the squared fitted values, which are close to zero under the null and deviate from zero under the alternatives. We show that the test statistic, after being appropriately standardized, is asymptotically normal under the null and under a sequence of Pitman local alternatives. A bootstrap procedure is proposed to improve the finite sample performance of our test. In addition, we extend the procedure to test other structures, such as the homogeneity of TVCs or the stability of heterogeneous coefficients. The joint test is extended to panel models with two-way fixed effects. Monte Carlo simulations indicate that our tests perform reasonably well in finite samples. We apply the tests to re-examine the environmental Kuznets curve in the United States and find that the model with homogeneous TVCs is more appropriate for this application.
{"title":"SPECIFICATION TESTS FOR TIME-VARYING COEFFICIENT PANEL DATA MODELS","authors":"A. Atak, Thomas Tao, Yonghui Zhang, Qiankun Zhou","doi":"10.1017/s026646662300018x","DOIUrl":"https://doi.org/10.1017/s026646662300018x","url":null,"abstract":"This paper provides nonparametric specification tests for the commonly used homogeneous and stable coefficients structures in panel data models. We first obtain the augmented residuals by estimating the model under the null hypothesis and then run auxiliary time series regressions of augmented residuals on covariates with time-varying coefficients (TVCs) via sieve methods. The test statistic is then constructed by averaging the squared fitted values, which are close to zero under the null and deviate from zero under the alternatives. We show that the test statistic, after being appropriately standardized, is asymptotically normal under the null and under a sequence of Pitman local alternatives. A bootstrap procedure is proposed to improve the finite sample performance of our test. In addition, we extend the procedure to test other structures, such as the homogeneity of TVCs or the stability of heterogeneous coefficients. The joint test is extended to panel models with two-way fixed effects. Monte Carlo simulations indicate that our tests perform reasonably well in finite samples. We apply the tests to re-examine the environmental Kuznets curve in the United States, and find that the model with homogenous TVCs is more appropriate for this application.","PeriodicalId":49275,"journal":{"name":"Econometric Theory","volume":"1 1","pages":""},"PeriodicalIF":0.8,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41861207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-05-19, DOI: 10.1017/s0266466623000178
TESTING FOR ANTICIPATED CHANGES IN SPOT VOLATILITY AT EVENT TIMES
V. Todorov, Yang Zhang
We propose a test for anticipated changes in spot volatility, due to either continuous or discontinuous price moves, at the times of realization of event risk in the form of pre-scheduled releases of economic information such as earnings announcements by firms and macroeconomic news announcements. These events can generate nontrivial volatility in asset returns, which does not scale even locally in time. Our test is based on short-dated options written on an underlying asset subject to event risk, where the event takes place after the options' observation time and prior to or after their expiration. We use options with different tenors to estimate the conditional (risk-neutral) characteristic functions of the underlying asset log-returns over the horizons of the options. Using these estimates and a relationship between the conditional characteristic functions with three different tenors, which holds if and only if continuous and discontinuous spot volatility does not change at the event time, we design a test for this hypothesis. In an empirical application, we study anticipated volatility changes for individual stocks following earnings announcements, for a set of stocks with good option coverage.
{"title":"TESTING FOR ANTICIPATED CHANGES IN SPOT VOLATILITY AT EVENT TIMES","authors":"V. Todorov, Yang Zhang","doi":"10.1017/s0266466623000178","DOIUrl":"https://doi.org/10.1017/s0266466623000178","url":null,"abstract":"We propose a test for anticipated changes in spot volatility, either due to continuous or discontinuous price moves, at the times of realization of event risk in the form of pre-scheduled releases of economic information such as earnings announcements by firms and macroeconomic news announcements. These events can generate nontrivial volatility in asset returns, which does not scale even locally in time. Our test is based on short-dated options written on an underlying asset subject to event risk, which takes place after the options’ observation time and prior to or after their expiration. We use options with different tenors to estimate the conditional (risk-neutral) characteristic functions of the underlying asset log-returns over the horizons of the options. Using these estimates and a relationship between the conditional characteristic functions with three different tenors, which holds true if and only if continuous and discontinuous spot volatility does not change at the event time, we design a test for this hypothesis. In an empirical application, we study anticipated individual stocks’ volatility changes following earnings announcements for a set of stocks with good option coverage.","PeriodicalId":49275,"journal":{"name":"Econometric Theory","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2023-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46458443","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}