R. Rebonato, "The Quickest Way to Lose the Money You Cannot Afford to Lose: Reverse Stress Testing With Maximum Entropy", Journal of Risk, 2018-01-24, doi:10.21314/JOR.2018.369.

We extend a technique devised by Saroka and Rebonato to “optimally” deform a yield curve in order to deal with a common and practically relevant class of optimization problems subject to linear constraints. In particular, we show how the idea can be applied to the case of reverse stress testing, and we present a case study to illustrate how it works. Finally, we point out a maximum-entropy interpretation of (or justification for) the procedure and present some obvious generalizations.
T. Adrian, "Risk Management and Regulation", Journal of Risk, 2017-10-26, doi:10.21314/JOR.2017.396.

The evolution of risk management has resulted from the interplay of financial crises, risk management practices, and regulatory actions. In the 1970s, research laid the intellectual foundations for the risk management practices that were systematically implemented in the 1980s as bond trading revolutionized Wall Street. Quants developed dynamic hedging, Value-at-Risk, and credit risk models based on the insights of financial economics. In parallel, the Basel I framework created a level playing field among banks across countries. Following the 1987 stock market crash, the near failure of Salomon Brothers, and the failure of Drexel Burnham Lambert, in 1996 the Basel Committee on Banking Supervision published the Market Risk Amendment to the Basel I Capital Accord; the amendment went into effect in 1998 and led to a migration of bank risk management practices toward market risk regulations. The framework was further developed in the Basel II Accord, which, however, was criticized from the very beginning as procyclical because its capital requirements rely on contemporaneous volatility estimates. Indeed, the failure to measure and manage risk adequately can be viewed as a key contributor to the 2008 global financial crisis. Subsequent innovations in risk management practices have been dominated by regulatory innovations, including capital and liquidity stress testing, macroprudential surcharges, resolution regimes, and countercyclical capital requirements.
Alessandra Amendola and V. Candila, "Comparing Multivariate Volatility Forecasts by Direct and Indirect Approaches", Journal of Risk, 2017-08-02, doi:10.21314/JOR.2017.364.

Multivariate volatility models can be evaluated via direct and indirect approaches. The former uses statistical loss functions (LFs) and a proxy to provide consistent estimates of the unobserved volatility. The latter uses utility LFs or other instruments, such as value-at-risk and its backtesting procedures. Existing studies commonly employ these procedures separately, focusing mostly on the multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) models. This work investigates and compares the two approaches in a model selection context. An extensive Monte Carlo simulation experiment is carried out, including MGARCH models based on daily returns and, extending the current literature, models that directly use the realized covariance, obtained from intraday returns. With reference to the direct approach, we rank the set of competing models empirically by means of four consistent statistical LFs and by reducing the quality of the volatility proxy. For the indirect approach, we use standard backtesting procedures to evaluate whether the number of value-at-risk violations is acceptable, and whether these violations are independently distributed over time.
Sol Kim, "Pricing and hedging options with rollover parameters", Journal of Risk, 2017-05-18, doi:10.21314/JOR.2017.352.

We implement a “horse race” competition between several option-pricing models for Standard & Poor’s 500 options. We consider trader rules (the so-called ad hoc Black–Scholes model) to predict future implied volatilities by applying simple ad hoc rules, as well as mathematically complicated option-pricing models, to the observed current implied volatility patterns. The traditional rollover strategy, ie, the nearest-to-next approach, and a new rollover strategy, the next-to-next approach, are also compared for the parameters of each option-pricing model. We find that simple trader rules dominate mathematically more sophisticated models, and that the next-to-next strategy can decrease the pricing and hedging errors of all option-pricing models, unlike the nearest-to-next approach. The “absolute smile” trader rule, which assumes that the implied volatility follows a fixed function of the strike price, has the advantage of simplicity and is the best model for pricing and hedging options.
B. Rémillard, H. Ben-Ameur and R. Chérif, "A dynamic program under Lévy processes for valuing corporate securities", Journal of Risk, doi:10.21314/jor.2022.051.

Most structural models for valuing corporate securities assume a geometric Brownian motion to describe the firm’s asset value. However, this does not reflect stylized market features: default is more often triggered by sudden information and shocks, which are not captured by the Gaussian model assumption. To remedy this, we propose a dynamic program for valuing corporate securities under various Lévy processes. Specifically, we study two jump diffusions and a pure-jump process. Under these settings, we build and experiment with a flexible framework that accommodates the balance-sheet equality, arbitrary corporate debts, multiple seniority classes, tax benefits and bankruptcy costs. While our approach applies to several Lévy processes, we compute and detail the equity, debt and total firm values, as well as the debt’s credit spreads, under Gaussian, double-exponential and variance-gamma jump models.
Shibo Bian, James E. Cicon and Yi Zhang, "Optimal Asset Management for Defined-Contribution Pension Funds with Default Risk", Journal of Risk, 2016-10-05, doi:10.21314/jor.2016.346.

We explore how a defined-contribution pension fund optimally distributes wealth between a defaultable bond, a stock and a bank account, given that the salary follows a stochastic process. We assume that the investment objective of the defined-contribution pension fund is to maximize the expected constant relative risk aversion utility of terminal wealth. We thus obtain a closed-form solution to the optimization problem using a martingale approach. We develop numerical simulations, which we graph as illustrations. Finally, we discuss relevant economic insights obtained from our results.
Sascha Desmettre and M. Deege, "Modeling Redemption Risks of Mutual Funds Using Extreme Value Theory", Journal of Risk, 2016-07-25, doi:10.21314/JOR.2016.336.

We show how redemption risks of mutual funds can be modeled using the peaks-over-threshold approach from extreme value theory. The resulting risk measure, liquidity-at-risk, is adapted to cover issues that arise when real-world fund redemption data is used, and we give guidelines on what should be considered in practice. We also provide an automated and easily applicable procedure for determining the threshold parameter of a generalized Pareto distribution from a given data set. Moreover, we supplement our findings with a thorough backtesting analysis.
M. Olschewsky, Stefan Lüdemann and Thorsten Poddig, "Finite Difference Methods for Estimating Marginal Risk Contributions in Asset Management", Journal of Risk, 2016-06-17, doi:10.21314/JOR.2016.334.

The decomposition of portfolio risk in terms of the underlying assets, which is extremely important for risk budgeting, asset allocation and risk monitoring, is well described by risk contributions. However, risk contributions cannot be calculated analytically for a considerable number of the risk models used in practice. We therefore study the use of finite difference methods for estimating risk contributions. We find that, for practically relevant setups, the additional estimation errors of the finite difference formulas are negligibly small. Since finite difference methods work for complex risk models and are independent of decisions about underlying distributions, we suggest them as the standard procedure for estimating risk contributions. As an application, we consider a general risk model that fits a kernel density estimate to the historical asset return distribution, combined with a finite difference method to arrive at the risk contributions. This combination turns out to work well in terms of estimation error.
{"title":"Comparing risk measures when aggregating market risk and credit risk using different copulas","authors":"Jakob Maciag, F. Hesse, Rolf Boeve, A. Pfingsten","doi":"10.21314/JOR.2016.335","DOIUrl":"https://doi.org/10.21314/JOR.2016.335","url":null,"abstract":"","PeriodicalId":46697,"journal":{"name":"Journal of Risk","volume":"18 1","pages":""},"PeriodicalIF":0.7,"publicationDate":"2016-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"67718525","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pilar Abad, Sonia Benito, C. Martín and M. Sánchez-Granero, "Evaluating the performance of the skewed distributions to forecast value-at-risk in the global financial crisis", Journal of Risk, 2016-05-05, doi:10.21314/JOR.2016.332.

Executive summary: This paper evaluates the performance of several skewed and symmetric distributions in modeling the tail behavior of daily returns and forecasting value-at-risk (VaR). First, we use goodness-of-fit tests to analyze which distribution best fits the data. The comparisons in terms of VaR are carried out by examining the accuracy of the VaR estimates and by minimizing the loss function from the point of view of the regulator and the firm. The results show that the skewed distributions outperform the normal and Student t (ST) distributions in fitting portfolio returns. Following a two-stage selection process, whereby we initially ensure that the distributions provide accurate VaR estimates and then focus on the firm’s loss function, we conclude that skewed distributions outperform the normal and ST distributions in forecasting VaR. From the point of view of the regulator, the superiority of the skewed distributions relative to the ST distribution is less evident. As firms are free to choose the VaR model they use to forecast VaR, in practice skewed distributions will be more frequently used.