
Latest Articles from the Journal of Operational Risk

Optimal B-Robust Posterior Distributions for Operational Risk
IF 0.5 | CAS Tier 4 (Economics) | Q4 (Business, Finance) | Pub Date: 2016-04-30 | DOI: 10.21314/jop.2016.182
Ivan Luciano Danesi, Fabio Piacenza, E. Ruli, L. Ventura
One of the aims of operational risk modelling is to generate sound and reliable quantifications of the risk exposure, including a level of volatility that is consistent with changes in the risk profile. One way of ensuring this is by means of robust procedures, such as Optimal B-Robust estimating equations. In banking practice, more than one dataset should be incorporated in the risk modelling, and a coherent way to carry out such data integration is via Bayesian procedures. However, Bayesian inference via estimating equations is in general problematic, since the likelihood function is not available. We illustrate that this issue can be dealt with using approximate Bayesian computation methods, with the robust estimating function as a summary of the data. The method is illustrated on a real dataset.
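A minimal sketch of the rejection-ABC idea described in the abstract, with a Huber M-estimate of location standing in for the optimal B-robust estimating function as the data summary (the estimator, prior, tolerance and all numbers here are illustrative assumptions, not the authors' specification):

```python
import numpy as np

rng = np.random.default_rng(0)

def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
    """Robust Huber M-estimate of location: a stand-in summary statistic
    for the paper's optimal B-robust estimating function."""
    mu = np.median(x)
    s = np.median(np.abs(x - mu)) / 0.6745   # MAD scale, held fixed
    for _ in range(max_iter):
        r = np.clip((x - mu) / s, -k, k)     # bounded influence of outliers
        mu_new = mu + s * r.mean()
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

# "Observed" log-losses with a few gross outliers
obs = np.concatenate([rng.normal(1.0, 0.5, 200), [8.0, 9.0]])
s_obs = huber_location(obs)

# Rejection ABC: propose from the prior, simulate, accept when the
# robust summary of the simulated data is close to the observed one.
accepted = []
for _ in range(20000):
    mu_prop = rng.normal(0.0, 2.0)               # prior on the location
    sim = rng.normal(mu_prop, 0.5, obs.size)
    if abs(huber_location(sim) - s_obs) < 0.05:  # ABC tolerance
        accepted.append(mu_prop)

posterior = np.array(accepted)
print(len(posterior), posterior.mean())
```

With a tighter tolerance the accepted sample approximates the posterior more closely, at the cost of fewer acceptances; because the summary is robust, the two outliers barely move the posterior.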
Citations: 6
A Simulation Comparison of Quantile Approximation Techniques for Compound Distributions Popular in Operational Risk
IF 0.5 | CAS Tier 4 (Economics) | Q4 (Business, Finance) | Pub Date: 2016-03-21 | DOI: 10.21314/JOP.2016.171
Riaan de Jongh, T. de Wet, K. Panman, H. Raubenheimer
Many banks currently use the loss distribution approach (LDA) for estimating economic and regulatory capital for operational risk under Basel's advanced measurement approach. The LDA requires the modeling of the aggregate loss distribution in each operational risk category (ORC), among others. The aggregate loss distribution is a compound distribution resulting from a random sum of losses, where the losses are distributed according to some severity distribution, and the number (of losses) is distributed according to some frequency distribution. In order to estimate the economic or regulatory capital in a particular ORC, an extreme quantile of the aggregate loss distribution has to be estimated from the fitted severity and frequency distributions. Since a closed-form expression for the quantiles of the resulting estimated compound distribution does not exist, the quantile is usually approximated using a brute force Monte Carlo simulation, which is computationally intensive. However, a number of numerical approximation techniques have been proposed to lessen the computational burden. Such techniques include Panjer recursion, the fast Fourier transform and different orders of both the single-loss approximation and perturbative approximation. The objective of this paper is to compare these methods in terms of their practical usefulness and potential applicability in an operational risk context. We find that the second-order perturbative approximation, a closed-form approximation, performs very well at the extreme quantiles and over a wide range of distributions, and it is very easy to implement. This approximation can then be used as an input to the recursive fast Fourier algorithm to gain further improvements at the less extreme quantiles.
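As a toy illustration of the two extremes compared in the abstract, the brute-force Monte Carlo quantile and the first-order single-loss approximation can be set side by side for a Poisson-lognormal compound (the parameter values are arbitrary choices for the sketch):

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(1)

lam, sigma, alpha = 5.0, 2.0, 0.999   # Poisson rate, log-sd, quantile level
sev = lognorm(s=sigma)                # lognormal severity with median 1

# Brute-force Monte Carlo for the aggregate (compound) loss distribution
n_sims = 200_000
counts = rng.poisson(lam, n_sims)
losses = sev.rvs(counts.sum(), random_state=rng)
csum = np.concatenate(([0.0], np.cumsum(losses)))
ends = np.cumsum(counts)
agg = csum[ends] - csum[ends - counts]   # aggregate loss per simulated year
q_mc = np.quantile(agg, alpha)

# First-order single-loss approximation for subexponential severities:
# Q_alpha(S) ~ F^{-1}(1 - (1 - alpha) / lambda)
q_sla = sev.ppf(1.0 - (1.0 - alpha) / lam)
print(q_mc, q_sla)
```

For heavy-tailed severities the two values agree to leading order at extreme quantiles; the second-order and perturbative corrections studied in the paper refine the approximation at less extreme levels.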
Citations: 6
Application of the Convolution Operator for Scenario Integration with Loss Data in Operational Risk Modeling
IF 0.5 | CAS Tier 4 (Economics) | Q4 (Business, Finance) | Pub Date: 2015-11-13 | DOI: 10.21314/jop.2015.168
Pavan Aroda, A. Guergachi, Huaxiong Huang
When using the advanced measurement approach to determine required regulatory capital for operational risk, expert opinion is applied via scenario analysis to help quantify exposure to high-severity events. A methodology is presented that makes use of the convolution operator to integrate scenarios into a baseline model. Using a baseline loss distribution model calibrated on historical losses and a scenario-derived loss distribution calibrated on scenario data points, the addition of both random processes equates to the convolution of the corresponding densities. Using an analogy from digital signal processing, the commutative property of convolution allows one function to smooth and average the other. The inherent uncertainty in scenario analysis has caused concern amongst practitioners when too much emphasis has been placed on absolutes in terms of quantified frequency/severity estimates. This method addresses this uncertainty and produces a combined loss distribution that takes information from the entire domain of the calibrated scenario distribution. The necessary theory is provided within and an example is shown to provide context.
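The convolution step itself can be illustrated with two discretized densities; the gamma shapes below are placeholders for the baseline (historical-loss) and scenario-derived severity models:

```python
import numpy as np
from scipy.stats import gamma

h = 0.1                       # discretization step
x = np.arange(0.0, 50.0, h)   # common loss grid

# Hypothetical baseline and scenario-derived densities, discretized
# to probability masses on the grid
f_base = gamma(a=2.0, scale=2.0).pdf(x) * h   # baseline, mean 4.0
f_scen = gamma(a=3.0, scale=1.5).pdf(x) * h   # scenario-derived, mean 4.5

# Density of the sum of the two independent losses = convolution
# of the individual densities (truncated back to the original grid)
f_sum = np.convolve(f_base, f_scen)[: x.size]
mean_sum = (f_sum * x).sum()
print(f_sum.sum(), mean_sum)
```

The convolved masses still sum to (almost) one, and the mean of the combined distribution is the sum of the component means (4.0 + 4.5 here), which gives a quick sanity check on the discretization.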
Citations: 1
Mitigating Rogue-Trading Behavior by Means of Appropriate, Effective Operational Risk Management
IF 0.5 | CAS Tier 4 (Economics) | Q4 (Business, Finance) | Pub Date: 2015-09-25 | DOI: 10.21314/JOP.2015.162
S. Rick, Gerrit Jan van den Brink
This paper discusses the violation of applicable firm guidelines by individuals employed by a bank or financial institution and suggests specific metrics to identify and prevent such behavior by means of appropriate, effective operational risk management. Since the actor is usually socially inconspicuous, and since the associated financial damage does not necessarily have to be verifiable through classic valuation methods (e.g. financial statements), we feel that it is very difficult for banks and financial institutions to uncover such behavior. Nevertheless, in order to be able to react to this latent risk, we apply modern, basic criminological assumptions to analyse the relationship between the multiple causes of the risk and their effects in the underlying risk origination process. The analysis is performed based on Schneider's model, which is used to describe the criminal behavior of socially inconspicuous individuals. Based on the result of that analysis, we design a specific conceptual risk indicator that approximates the underlying risk exposure by means of a linear function. We then operate the developed risk indicators through a dashboard, tracking the development of each valid indicator value through time. The effectiveness of the measures taken to counteract the risk can be derived from the development of the displayed indicator value and the related trend.
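A linear composite indicator of the kind the abstract alludes to might look as follows; the metric names, normalization bounds and weights are invented for illustration, not taken from the paper:

```python
import numpy as np

# Expert-chosen weights for hypothetical rogue-trading warning metrics
weights = {"cancelled_trades": 0.4, "limit_breaches": 0.35, "late_confirmations": 0.25}
bounds = {"cancelled_trades": (0, 50), "limit_breaches": (0, 10), "late_confirmations": (0, 100)}

def composite_indicator(metrics):
    """Weighted linear risk indicator: normalize each metric to [0, 1]
    against its bounds, then combine with the weights."""
    score = 0.0
    for name, w in weights.items():
        lo, hi = bounds[name]
        norm = min(max((metrics[name] - lo) / (hi - lo), 0.0), 1.0)
        score += w * norm
    return score

# Three reporting periods of (hypothetical) dashboard inputs
history = [
    {"cancelled_trades": 5, "limit_breaches": 1, "late_confirmations": 10},
    {"cancelled_trades": 20, "limit_breaches": 4, "late_confirmations": 35},
    {"cancelled_trades": 45, "limit_breaches": 9, "late_confirmations": 80},
]
scores = [composite_indicator(m) for m in history]
trend = np.polyfit(range(len(scores)), scores, 1)[0]   # slope as trend signal
print(scores, trend)
```

The indicator value and its trend are what a dashboard would track through time; a rising slope would flag that mitigating measures are not yet effective.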
Citations: 2
Bayesian Operational Risk Models
IF 0.5 | CAS Tier 4 (Economics) | Q4 (Business, Finance) | Pub Date: 2015-06-08 | DOI: 10.21314/JOP.2015.155
Silvia Figini, Lijun Gao, Paolo Giudici
Operational risk is hard to quantify, owing to the presence of heavy-tailed loss distributions. Extreme value distributions, used in this context, are very sensitive to the data, which is a problem in the presence of rare loss data. Self risk assessment questionnaires, if properly modelled, may provide the missing piece of information that is necessary to adequately estimate operational risks. In this paper we propose to embody self risk assessment data into suitable prior distributions, and to follow a Bayesian approach to merge self assessment with loss data. We derive operational loss posterior distributions, from which appropriate measures of risk, such as the Value at Risk or the Expected Shortfall, can be derived. We test our proposed models on a real database, made up of internal loss data and self risk assessment questionnaires of an anonymous commercial bank. Our results show that the proposed Bayesian models perform better than classical extreme value models, leading to a smaller quantification of the Value at Risk required to cover unexpected losses.
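A minimal conjugate sketch of the merging idea: treat the self-assessment as a normal prior on the mean log-severity, update with internal log-losses, and read a VaR off the posterior predictive. The known log-sd and all figures are assumptions made for the sketch; the paper's models are richer than this:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

sigma = 1.2                    # assumed known log-severity sd
mu0, tau0 = 10.0, 0.5          # prior from self risk assessment questionnaire
log_losses = rng.normal(10.4, sigma, 60)   # internal loss data (log scale)

# Normal-normal conjugate update of the mean log-severity
n, xbar = log_losses.size, log_losses.mean()
tau_n2 = 1.0 / (1.0 / tau0**2 + n / sigma**2)          # posterior variance
mu_n = tau_n2 * (mu0 / tau0**2 + n * xbar / sigma**2)  # posterior mean

# 99.9% VaR of a single loss from the posterior predictive on the log scale
z = norm.ppf(0.999)
var_999 = np.exp(mu_n + z * np.sqrt(sigma**2 + tau_n2))
print(mu_n, var_999)
```

The posterior mean sits between the questionnaire prior and the loss-data average, which is exactly the "merge" the abstract describes: scarce internal data are stabilized by the self-assessment prior rather than dominating the VaR on their own.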
Citations: 14
Shapley Allocation, Diversification and Services in Operational Risk
IF 0.5 | CAS Tier 4 (Economics) | Q4 (Business, Finance) | Pub Date: 2015-06-01 | DOI: 10.21314/jop.2018.205
P. Mitic, Bertrand K. Hassani
A method of allocating operational risk regulatory capital using the Shapley method for a large number of business units, supported by a service, is proposed. A closed-form formula for Shapley allocations is developed under two principal assumptions. First, if business units form coalitions, the value added to the coalition by a new entrant depends on a constant proportionality factor. This factor represents the diversification that can be achieved by combining operational risk losses. Second, the service should reduce the capital payable by business units, and this reduction is calculated as an integral part of the allocation process. We ensure that allocations of capital charges are acceptable to, and understandable by, both risk managers and senior managers. The results derived are applied to recent loss data.
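One concrete reading of the proportionality assumption (an assumption of this sketch, not necessarily the paper's exact game) is v(S) = d^(|S|-1) · Σ c_i, with d a constant diversification factor. For a small number of units a brute-force Shapley computation is feasible, whereas the paper derives a closed form for many:

```python
from itertools import permutations

# Standalone capital per business unit (hypothetical figures) and a
# constant diversification factor d < 1
capital = {"A": 100.0, "B": 60.0, "C": 40.0}
d = 0.9

def v(coalition):
    """Coalition value: diversified sum of standalone capitals."""
    if not coalition:
        return 0.0
    return d ** (len(coalition) - 1) * sum(capital[u] for u in coalition)

def shapley(players):
    """Shapley value via average marginal contribution over all orderings."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        seen = []
        for p in order:
            phi[p] += v(seen + [p]) - v(seen)
            seen.append(p)
    return {p: val / len(perms) for p, val in phi.items()}

alloc = shapley(list(capital))
print(alloc, sum(alloc.values()), v(list(capital)))
```

By construction the allocations are efficient (they sum exactly to the diversified capital of the grand coalition) and larger units receive larger charges, two of the acceptability properties the abstract stresses.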
Citations: 1
Improved Goodness-of-Fit Measures
IF 0.5 | CAS Tier 4 (Economics) | Q4 (Business, Finance) | Pub Date: 2015-03-30 | DOI: 10.21314/JOP.2015.159
P. Mitic
New goodness-of-fit measures that are significant improvements on existing measures are described. They use the intuitive geometrical concept of the area enclosed by the curve of a fitted distribution and the profile of the empirical cumulative distribution function. A transformation of this profile simplifies the geometry and provides three new goodness-of-fit tests. The integrity of this transformation is justified by topological arguments. The new tests provide a quantitative justification for qualitative judgements on goodness-of-fit, are independent of population size and provide a workable way to objectively choose a best-fit distribution from a group of candidate distributions.
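The basic geometric quantity, before the transformation the paper applies, is the area enclosed between the empirical CDF and a fitted CDF. A sketch with a normal sample, its fitted model, and a deliberately misplaced alternative fit:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
sample = np.sort(rng.normal(0.0, 1.0, 500))
ecdf = np.arange(1, sample.size + 1) / sample.size   # empirical CDF heights

def enclosed_area(fitted):
    """Trapezoidal approximation of the area between the empirical CDF
    and a fitted CDF over the sample range."""
    gap = np.abs(ecdf - fitted.cdf(sample))
    return np.sum((gap[1:] + gap[:-1]) / 2.0 * np.diff(sample))

area_good = enclosed_area(norm(sample.mean(), sample.std(ddof=1)))
area_bad = enclosed_area(norm(2.0, 1.0))   # deliberately misplaced fit
print(area_good, area_bad)
```

A well-fitting distribution encloses a small area with the empirical profile while a poor fit encloses a large one, which is the intuition the paper turns into formal tests.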
Citations: 4
A Checklist-Based Weighted Fuzzy Severity Approach for Calculating Operational Risk Exposure on Foreign Exchange Trades Under the Basel II Regime
IF 0.5 | CAS Tier 4 (Economics) | Q4 (Business, Finance) | Pub Date: 2014-12-19 | DOI: 10.21314/jop.2014.136
V. Sree Hari Rao, K. Ramesh
It is well-known that any risk management activity is a cost to the organization. However, optimized risk management practices satisfy regulatory capital requirements and gain the confidence of investors who take calculated risks. A bank’s risk management division will generate a profit if it can develop methodologies to decrease the nonworking regulatory capital. This may be achieved only when the risks are measured using data from internal and external sources in conjunction with scenario analysis. One such method of measuring operational risk (OR) is the advanced measurement approach. This involves quantifying ORs across the various nodes within a bank following the loss distribution approach, in which the frequency and severity distributions of the loss-generating OR events are estimated from the data sources. These distributions are then used to generate the scenarios for frequency and its associated severity for estimating the OR capital. In our approach, the various levels of loss severity are mapped to a percentage of total trade exposure, and the occurrence frequency of an OR event is assumed to follow a binomial distribution.
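A stripped-down numeric sketch of the frequency/severity structure in the abstract, with binomial event counts and checklist-weighted severity fractions of total exposure (every figure below is hypothetical, and the weighting is a crude stand-in for the paper's fuzzy scheme):

```python
import numpy as np

rng = np.random.default_rng(4)

exposure = 50_000_000.0            # total FX trade exposure (hypothetical)
n_trades, p_event = 10_000, 0.002  # trades per year, per-trade OR event prob.

# Checklist-derived severity levels as fractions of exposure, with weights
severity_pct = np.array([0.001, 0.005, 0.02])   # low / medium / high severity
weights = np.array([0.7, 0.25, 0.05])           # membership weights, sum to 1

w_sev = (severity_pct * weights).sum()          # weighted severity fraction
expected_loss = n_trades * p_event * w_sev * exposure

# Binomial frequency scenarios give a simple simulated loss distribution
counts = rng.binomial(n_trades, p_event, 100_000)
sim_losses = counts * w_sev * exposure
print(expected_loss, np.quantile(sim_losses, 0.999))
```

The expected loss matches frequency times weighted severity, while the extreme quantile of the simulated distribution is what feeds an OR capital estimate.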
Citations: 2
Dissecting the JPMorgan Whale: A Post-Mortem
IF 0.5 | CAS Tier 4 (Economics) | Q4 (Business, Finance) | Pub Date: 2014-06-30 | DOI: 10.21314/JOP.2014.144
P. Mcconnell
In many respects, the “London whale” scandal at JPMorgan Chase is similar to other “rogue trading” events, in that a group of traders took large, speculative positions in complex derivative securities that went wrong, resulting in over US$6 billion of trading losses to the firm. As in other rogue trading cases, there were desperate attempts to cover up the losses until they became too big to ignore and eventually had to be recognized in the financial accounts of the bank. However, the whale case, so-called because of the sheer size of the trading positions involved, differs in several important respects from other rogue trading cases, not least because the sheer size and riskiness of the positions were well-known to many executives within JPMorgan, a firm that prided itself on having advanced risk management capabilities and systems. The role of Model Risk in this scandal, while not the primary cause, is important in that at least part of the impetus to take huge positions was due to incorrect risk modeling. Various external and internal inquiries into the events have concluded that critical risk management processes in the bank broke down, not only in the Chief Investment Office, the division in which the losses occurred, but across the bank. In particular, deficiencies in the firm’s Model Development and Approval processes allowed traders to trade while underestimating the risks that they were running. Under Basel II regulations, losses due to process failure are classified as operational risk losses and hence this case demonstrates a significant failure of operational risk management in JPMorgan. This paper dissects the whale scandal from an operational risk perspective using the late Professor Barry Turner’s framework for analyzing organizational disasters. The paper also makes suggestions as to how model risk may be managed to prevent similar losses in future.
Citations: 15
Specification Test for Threshold Estimation in Extreme Value Theory
IF 0.5 · Economics (CAS Tier 4) · Q4 BUSINESS, FINANCE · Pub Date: 2014-06-30 · DOI: 10.21314/JOP.2014.145
L. C. Miranda
A fundamental component in the modeling of a financial risk exposure is the estimation of the probability distribution function that best describes the true data-generation process of independent and extreme loss events that fall above a certain threshold. In this paper, we assume that, above the threshold, the extreme loss events are explained by an extreme value distribution. For that purpose, we apply the classical peaks-over-threshold method in extreme-value statistics. According to that approach, data in excess of a certain threshold is asymptotically described by a generalized Pareto distribution (GPD). Consequently, establishing a mechanism to estimate this threshold is of major importance. The current methods to estimate the thresholds are based on a subjective inspection of mean excess plots or other statistical measures; the Hill estimator, for example, leads to an undesirable level of subjectivity. In this paper, we propose an innovative mechanism that increases the level of objectivity of threshold selection, departing from a subjective and imprecise eyeballing of charts. The proposed algorithm is based on the properties of the generalized Pareto distribution and considers the choice of threshold to be an important modeling decision that can have significant impact on the model outcomes. The algorithm we introduce here is based on the Hausman specification test to determine the threshold, which maintains proper specification so that the other parameters of the distribution can be estimated without compromising the balance between bias and variance. We apply the test to real risk data so that we can obtain a practical example of the improvements the process will bring. Results show that the Hausman test is a valid mechanism for estimating the GPD threshold and can be seen as a relevant enhancement in the objectivity of the entire process.
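The peaks-over-threshold step described in the abstract above — keep only the losses that exceed a threshold u and fit a generalized Pareto distribution (GPD) to the exceedances — can be sketched as follows. This is a minimal illustration on simulated lognormal losses, not the paper's Hausman-based algorithm; the 95% empirical quantile used as the threshold is exactly the kind of arbitrary choice the paper argues should be replaced by an objective test.

```python
# Minimal peaks-over-threshold sketch: fit a GPD to exceedances above
# a candidate threshold u (threshold choice here is an assumption).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated heavy-tailed operational losses (illustrative only)
losses = stats.lognorm.rvs(s=1.2, scale=1e4, size=5000, random_state=rng)

u = np.quantile(losses, 0.95)        # candidate threshold (subjective choice)
excesses = losses[losses > u] - u    # exceedances over the threshold

# Fit the GPD to the excesses with the location parameter fixed at 0,
# as is standard for threshold exceedances
shape, loc, scale = stats.genpareto.fit(excesses, floc=0)
print(f"threshold={u:.0f}, n_excess={len(excesses)}, "
      f"shape={shape:.3f}, scale={scale:.0f}")
```

In practice one would repeat the fit over a grid of candidate thresholds and apply a selection criterion — the subjective mean-excess-plot inspection the abstract criticizes, or the Hausman-test mechanism the paper proposes.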
{"title":"Specification Test for Threshold Estimation in Extreme Value Theory","authors":"L. C. Miranda","doi":"10.21314/JOP.2014.145","DOIUrl":"https://doi.org/10.21314/JOP.2014.145","abstract":"A fundamental component in the modeling of a financial risk exposure is the estimation of the probability distribution function that best describes the true data-generation process of independent and extreme loss events that fall above a certain threshold. In this paper, we assume that, above the threshold, the extreme loss events are explained by an extreme value distribution. For that purpose, we apply the classical peaks-over-threshold method in extreme-value statistics. According to that approach, data in excess of a certain threshold is asymptotically described by a generalized Pareto distribution (GPD). Consequently, establishing a mechanism to estimate this threshold is of major importance. The current methods to estimate the thresholds are based on a subjective inspection of mean excess plots or other statistical measures; the Hill estimator, for example, leads to an undesirable level of subjectivity. In this paper, we propose an innovative mechanism that increases the level of objectivity of threshold selection, departing from a subjective and imprecise eyeballing of charts. The proposed algorithm is based on the properties of the generalized Pareto distribution and considers the choice of threshold to be an important modeling decision that can have significant impact on the model outcomes. The algorithm we introduce here is based on the Hausman specification test to determine the threshold, which maintains proper specification so that the other parameters of the distribution can be estimated without compromising the balance between bias and variance. We apply the test to real risk data so that we can obtain a practical example of the improvements the process will bring. Results show that the Hausman test is a valid mechanism for estimating the GPD threshold and can be seen as a relevant enhancement in the objectivity of the entire process.","PeriodicalId":54030,"journal":{"name":"Journal of Operational Risk","volume":"35 1","pages":"23-37"},"PeriodicalIF":0.5,"publicationDate":"2014-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89028079","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
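For reference, the Hausman specification test that the Miranda paper builds on compares an estimator that is efficient under correct specification with one that remains consistent under misspecification; under the null of correct specification the quadratic form of their difference is asymptotically chi-squared. The sketch below computes only the generic statistic H = d'(V_cons − V_eff)⁻¹d — how the two GPD estimators are constructed at each candidate threshold is the paper's contribution and is not reproduced here.

```python
# Generic Hausman specification statistic (not the paper's specific
# GPD construction): H = d' (V_cons - V_eff)^{-1} d, d = b_cons - b_eff.
import numpy as np
from scipy import stats

def hausman_statistic(b_eff, b_cons, V_eff, V_cons):
    """Return (H, p-value). Under H0 both estimators are consistent and
    b_eff is efficient, so H is asymptotically chi2 with k = len(d) df."""
    d = np.asarray(b_cons, float) - np.asarray(b_eff, float)
    V = np.asarray(V_cons, float) - np.asarray(V_eff, float)
    H = float(d @ np.linalg.solve(V, d))
    p = stats.chi2.sf(H, df=len(d))
    return H, p
```

A large H (small p) signals that the two estimates diverge more than sampling noise allows, i.e. the specification — here, the candidate threshold — is rejected.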