"Clustering and Classification in Option Pricing"
N. Gradojevic, D. Kukolj and R. Gencay, Review of Economic Analysis, 2011. doi:10.15353/rea.v3i2.1458

This paper reviews the recent option pricing literature and investigates how clustering and classification can assist option pricing models. Specifically, we consider non-parametric modular neural network (MNN) models to price S&P 500 European call options. The focus is on decomposing and classifying options data into a number of sub-models across moneyness and maturity ranges that are processed individually. The fuzzy learning vector quantization (FLVQ) algorithm we propose generates decision regions (i.e., option classes) divided by 'intelligent' classification boundaries. Such an approach improves the generalization properties of the MNN model and thereby increases its pricing accuracy.
"Nonlinear Stochastic Convergence Analysis of Regional Unemployment Rates in Poland"
Joanna Tyrowicz and Piotr Wójcik, Review of Economic Analysis, 2011. doi:10.15353/rea.v3i1.1377

This paper analyzes the convergence of unemployment rates in Poland at the NUTS-4 level by testing for nonlinear convergence, applying the modified KSS-CHLL test to each pair of territorial units. The results suggest that convergence is in fact a rare phenomenon, occurring in only 1,916 of more than 70,000 potential pairs. The paper then asks what systematic factors contribute to this finding. There are circumstances under which unemployment convergence should be more expected than in others: sharing a higher-level territorial authority, experiencing similar labour market hardship, or sharing the same structural characteristics. For each of these three criteria we analyse the frequency of differential nonstationarity within groups (as evidence of convergence) and across groups (as evidence of "catching up").
"The Spatial Distribution of Labour Force Participation and Market Earnings at the Sub-National Level in Ireland"
Karyn Morrissey and C. O'Donoghue, Review of Economic Analysis, 2011. doi:10.15353/rea.v3i1.1378

The main aim of this paper is to provide a spatial modelling framework for labour force participation (LFP) and income estimation. The development of a household income distribution for Ireland had previously been hampered by the lack of disaggregated data on individual earnings. Spatial microsimulation, through a process of calibration, provides a method that allows one to recreate the spatial distribution of LFP and household market income at the small-area level. Further analysis examines the relationship between LFP, occupational type and market income at the small-area level in Co. Galway, Ireland.
"Firm Corruption in the Presence of an Auditor"
M. Dietrich, Jolian P. McHardy and Abhijit Sharma, Review of Economic Analysis, 2010. doi:10.15353/rea.v8i2.1511

We develop a theoretical framework exploring firm corruption, accounting for interactions with an auditor who provides auditing and other services. A multiplicity of equilibria can exist, including stable corruption and auditor-controlled corruption. Whilst fining the auditor cannot eliminate all corruption, fining the firm can; however, marginal increases in this fine can also have perverse effects. Investing in corruption detection may be effective in deterring auditor corruption but ineffective in deterring firm corruption. Policy effectiveness depends strongly on several factors that may be hard to observe in practice, making general rules about policy interventions against corruption very difficult to formulate.
"Bayesian Methods for Completing Data in Spatial Models"
W. Polasek, Carlos Llano and Richard Sellner, Review of Economic Analysis, 2010. doi:10.15353/rea.v2i2.1472

Chow and Lin (1971) were the first to develop a unified framework for the three problems (interpolation, extrapolation and distribution) of predicting time series by related series (the 'indicators'). This paper develops a spatial Chow-Lin procedure for cross-sectional data and compares classical and Bayesian estimation methods. We outline the error covariance structure in a spatial context and derive the BLUE for ML and Bayesian MCMC estimation. In an example, we apply the procedure to Spanish regional GDP data between 2000 and 2004. We assume that only NUTS-2 GDP is known and predict GDP at the NUTS-3 level using socio-economic and spatial information available at NUTS-3. The spatial neighbourhood is defined by either kilometre distance, travel time, contiguity or trade relationships. After some sensitivity analysis, we present forecast accuracy criteria comparing the predicted values with the observed ones.
"Finance, Market, Globalization: Lessons from the 2007-09 Crisis"
S. Rossi, Review of Economic Analysis, 2010. doi:10.15353/rea.v2i3.1368

Never before has finance been regarded with the suspicion of these past three years. Yet money and credit are part of what enabled human beings to overcome the barbarism of immanence, the savagery of a life ruled by consumption for survival. The critical rethinking of finance involves the very notion of the market economy and the globalized dimension it has reached. Who failed in this crisis, the State or the Market? The State, although by virtue of a paradox. Is globalization the culprit? No. A different financial system must emerge from the 2007-09 crisis: freed from the "idolatry of laissez-faire", but not thrown back into obsolete technologies or permanently restrained by anachronistic "real socialism" bridles.
"An Agnostic Look at Bayesian Statistics and Econometrics"
R. Davidson, Review of Economic Analysis, 2010. doi:10.15353/rea.v2i2.1470

Bayesians and non-Bayesians, often called frequentists, seem to be perpetually at loggerheads on fundamental questions of statistical inference. This paper takes as agnostic a stand as is possible for a practising frequentist, and tries to elicit a Bayesian answer to questions of interest to frequentists. The argument is based on my presentation at a debate organised by the Rimini Centre for Economic Analysis, between me as the frequentist "advocate" and Christian Robert on the Bayesian side.
"Size Distortion of Bootstrap Tests: an Example from Unit Root Testing"
R. Davidson, Review of Economic Analysis, 2010. doi:10.15353/rea.v2i2.1471

Testing for a unit root in a series obtained by summing a stationary MA(1) process with a parameter close to -1 leads to serious size distortions under the null, on account of the near cancellation of the unit root by the MA component of the driving stationary series. The situation is analysed from the point of view of bootstrap testing, and an exact quantitative account is given of the error in rejection probability of a bootstrap test. A particular method of estimating the MA parameter is recommended, as it leads to very little distortion even when the MA parameter is close to -1. A new bootstrap procedure with still better properties is proposed. While more computationally demanding than the usual bootstrap, it is much less so than the double bootstrap.
"Guest Editorial: Workshop on Bayesian Econometric Methods"
Rodney W. Strachan, Review of Economic Analysis, 2010. doi:10.15353/rea.v2i2.1467

The 3rd RCEA Bayesian workshop was held in Rimini in July 2009. The workshop opened with a debate, chaired by Gael Martin, between a leading Bayesian econometrician, Christian Robert, and a leading classical econometrician, Russell Davidson, on the relative virtues of the Bayesian and classical (or frequentist) approaches. The pleasantly civil debate was conducted under the topic "The 21st Century Belongs to Bayes". The workshop also brought together a variety of both classical and Bayesian econometricians and statisticians, with a view to participants exchanging information on developments in their specific fields of research. The two papers in this volume (one classical in approach, one Bayesian with insights into classical approaches) partially reflect this purpose; many more papers were presented at the workshop itself.

In his paper, Russell Davidson investigates issues in bootstrap testing using, as an example, testing for a unit root in an autoregressive moving average, ARMA(1,1), process. He focuses on the situation where the root of the MA polynomial is close to minus one. Size distortions in bootstrap tests result when testing in this situation, due to the near cancellation of the unit AR root with the MA root. Davidson proposes estimators based on nonlinear least squares that are faster to compute than, but not quite as efficient as, the maximum likelihood estimator. These estimators are slower to compute than those proposed by Galbraith and Zinde-Walsh (1994, 1997), but far more efficient. Further, Davidson proposes a new bootstrap procedure that is computationally less demanding than the double bootstrap of Beran (1988). Davidson produces all of his results without recourse to asymptotic theory or asymptotic refinements of the testing procedure.

Polasek, Sellner and Llano consider the problem of estimation and prediction in spatial models when data are measured at different degrees of aggregation and some observations are missing at one or more levels of aggregation. They adapt the procedure of Chow and Lin (1971), which was developed for time series data observed at different frequencies, to the spatial, cross-sectional setting.
"'The 21st Century Belongs to Bayes' Debate: Introduction"
G. Martin, Review of Economic Analysis, 2010. doi:10.15353/rea.v2i2.1468

The primary aim of the Rimini debate was to highlight the value of both the Bayesian and frequentist (classical) paradigms, and the contributions that both make to statistical practice in the applied sciences. That said, the organizers did not wish the exercise to be a sanitized one: the shortcomings of both methods were also to be confronted. Hence, a topic was chosen that was provocative enough to bring those shortcomings to the fore, but which also had the potential to lead to some reconciliation between these two important areas of intellectual endeavour. The topic also seemed particularly apt, positioned as we are at the beginning of the second decade of the new century, and nearly two decades on from the advent of the (Bayesian) Markov chain Monte Carlo 'revolution'. The two speakers were selected by the organizers because of their renowned authority in the respective fields of Bayesian and frequentist inference, and both produced stimulating and lively presentations for the audience. For the purposes of publication, however, both authors have chosen to synthesize their presentations into two short, but dense, treatises on the respective paradigms. As Russell Davidson has crafted his paper in a way that poses certain pertinent questions to the Bayesian community, we have published his paper first. Christian Robert, in addition to expounding his view of the Bayesian paradigm and the reasons for his adherence to it, then addresses some of those questions. Christian Robert also plays the devil's advocate throughout his own paper, noting criticisms that have been levelled at the Bayesian approach.