{"title":"嘉宾评论:贝叶斯计量经济学方法研讨会","authors":"Rodney W. Strachan","doi":"10.15353/rea.v2i2.1467","DOIUrl":null,"url":null,"abstract":"The 3rd RCEA Bayesian workshop was held in Rimini in July, 2009. The workshop opened with a debate, chaired by Gael Martin, between a leading Bayesian econometrician, Christian Robert, and a leading classical econometrician, Russell Davidson, on the relative virtues of the Bayesian and classical (or frequentist) approaches. The pleasantly civil debate was conducted under the topic \"The 21st Century Belongs to Bayes\". The workshop also brought together a variety of both classical and Bayesian econometricians and statisticians, with a view to participants exchanging information on developments in their specific fields of research. The two papers of this volume one classical in approach and one Bayesian, with insights into classical approaches partially reflect this purpose, with many more papers having been presented at the workshop itself. In his paper, Russell Davidson investigates issues in bootstrap testing using, as an example, testing for a unit root in an autoregressive moving average (ARMA) (1,1) process. He focuses on the situation when the root of the MA polynomial is close to minus one. Size distortions in the bootstrap tests result when testing in this situation, due to the near cancellation of the unit AR root with the MA root. Davidson proposes estimators based on nonlinear least squares that are faster to compute than, but not quite as efficient as, the maximum likelihood estimator. These estimators are slower to compute than those proposed by Galbraith and Zinde-Walsh (1994) and (1997), but far more efficient than the Galbraith and Zinde-Walsh estimators. Further, Davidson proposes a new bootstrap procedure that is computationally less demanding than the double bootstrap of Beran (1988). Davidson produces all of his results without recourse to asymptotic theory or asymptotic refinements of the testing procedure. Polasek, Sellner and Llano consider the problem of estimation and prediction in spatial models when data measurements are taken at different degrees of aggregation and where some observations are missing at one or more levels of aggregation. They adapt the procedure of Chow and Lin (1971), which was developed for time series data with observations on","PeriodicalId":42350,"journal":{"name":"Review of Economic Analysis","volume":null,"pages":null},"PeriodicalIF":0.7000,"publicationDate":"2010-08-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Guest Editorial: Workshop on Bayesian Econometric Methods\",\"authors\":\"Rodney W. Strachan\",\"doi\":\"10.15353/rea.v2i2.1467\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The 3rd RCEA Bayesian workshop was held in Rimini in July, 2009. The workshop opened with a debate, chaired by Gael Martin, between a leading Bayesian econometrician, Christian Robert, and a leading classical econometrician, Russell Davidson, on the relative virtues of the Bayesian and classical (or frequentist) approaches. The pleasantly civil debate was conducted under the topic \\\"The 21st Century Belongs to Bayes\\\". The workshop also brought together a variety of both classical and Bayesian econometricians and statisticians, with a view to participants exchanging information on developments in their specific fields of research. 
The two papers of this volume one classical in approach and one Bayesian, with insights into classical approaches partially reflect this purpose, with many more papers having been presented at the workshop itself. In his paper, Russell Davidson investigates issues in bootstrap testing using, as an example, testing for a unit root in an autoregressive moving average (ARMA) (1,1) process. He focuses on the situation when the root of the MA polynomial is close to minus one. Size distortions in the bootstrap tests result when testing in this situation, due to the near cancellation of the unit AR root with the MA root. Davidson proposes estimators based on nonlinear least squares that are faster to compute than, but not quite as efficient as, the maximum likelihood estimator. These estimators are slower to compute than those proposed by Galbraith and Zinde-Walsh (1994) and (1997), but far more efficient than the Galbraith and Zinde-Walsh estimators. Further, Davidson proposes a new bootstrap procedure that is computationally less demanding than the double bootstrap of Beran (1988). Davidson produces all of his results without recourse to asymptotic theory or asymptotic refinements of the testing procedure. Polasek, Sellner and Llano consider the problem of estimation and prediction in spatial models when data measurements are taken at different degrees of aggregation and where some observations are missing at one or more levels of aggregation. They adapt the procedure of Chow and Lin (1971), which was developed for time series data with observations on\",\"PeriodicalId\":42350,\"journal\":{\"name\":\"Review of Economic Analysis\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.7000,\"publicationDate\":\"2010-08-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Review of Economic Analysis\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.15353/rea.v2i2.1467\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ECONOMICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Review of Economic Analysis","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.15353/rea.v2i2.1467","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ECONOMICS","Score":null,"Total":0}
Guest Editorial: Workshop on Bayesian Econometric Methods
The 3rd RCEA Bayesian workshop was held in Rimini in July 2009. The workshop opened with a debate, chaired by Gael Martin, between a leading Bayesian econometrician, Christian Robert, and a leading classical econometrician, Russell Davidson, on the relative virtues of the Bayesian and classical (or frequentist) approaches. The pleasantly civil debate was conducted under the topic "The 21st Century Belongs to Bayes". The workshop also brought together a variety of both classical and Bayesian econometricians and statisticians, with a view to participants exchanging information on developments in their specific fields of research. The two papers in this volume, one classical in approach and one Bayesian with insights into classical approaches, partially reflect this purpose, with many more papers having been presented at the workshop itself.

In his paper, Russell Davidson investigates issues in bootstrap testing using, as an example, testing for a unit root in an autoregressive moving average, ARMA(1,1), process. He focuses on the situation in which the root of the MA polynomial is close to minus one. Testing in this situation produces size distortions in the bootstrap tests, due to the near cancellation of the unit AR root with the MA root. Davidson proposes estimators based on nonlinear least squares that are faster to compute than, but not quite as efficient as, the maximum likelihood estimator. These estimators are slower to compute than those proposed by Galbraith and Zinde-Walsh (1994, 1997), but far more efficient than the Galbraith and Zinde-Walsh estimators. Further, Davidson proposes a new bootstrap procedure that is computationally less demanding than the double bootstrap of Beran (1988). Davidson produces all of his results without recourse to asymptotic theory or asymptotic refinements of the testing procedure.

Polasek, Sellner and Llano consider the problem of estimation and prediction in spatial models when data measurements are taken at different degrees of aggregation and where some observations are missing at one or more levels of aggregation. They adapt the procedure of Chow and Lin (1971), which was developed for time series data observed at different frequencies.
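To fix ideas, the Chow and Lin (1971) procedure that Polasek, Sellner and Llano adapt can be sketched in a stylized form (the notation below is illustrative and is not drawn from the paper itself). Suppose the unobserved disaggregate series $y$ satisfies $y = X\beta + u$ with $\mathrm{Var}(u) = \Sigma$, while only the aggregate $y_a = Cy$ is observed for a known aggregation matrix $C$. The best linear unbiased predictor of $y$ is then
\[
\hat{\beta} = \bigl[X'C'(C\Sigma C')^{-1}CX\bigr]^{-1} X'C'(C\Sigma C')^{-1} y_a,
\qquad
\hat{y} = X\hat{\beta} + \Sigma C'(C\Sigma C')^{-1}\bigl(y_a - CX\hat{\beta}\bigr),
\]
that is, a GLS fit at the disaggregate level with the aggregate residuals distributed back to the finer level of observation. The paper carries this logic over to spatial levels of aggregation and to settings with missing observations.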