Gianluca Macchia, Emanuele De Angelis, Michele Vitagliano
After a short review of the MiFID regulations and the Risk Appetite Framework (RAF), the paper identifies the link between them, which makes it possible to mitigate a balance sheet risk borne by the financial intermediary and, at the same time, to improve its stability and value creation by maximizing customer loyalty. The client's attitude towards risk can be summarized as: "I don't like risk, but I like to win". A three-dimensional approach to expected utility is therefore suggested for estimating risk tolerance: risk aversion, loss aversion and reflection. In addition, the client's financial objectives must be defined and combined with greater disclosure, which allows the construction of a personal financial statement, to attest that risk-taking is indeed a luxury, as indicated by the discretionary wealth ratio and, therefore, by the Standard of Living Risk (SLR). The next step is the determination of a set of portfolios along an efficient frontier where risk is represented by the expected shortfall, estimated following a Generalized Extreme Value Theory logic. The client's objectives are described in terms of probability of success, which is a function of an initial endowment, potential positive contributions of financial resources over time and an expected return level. The above is illustrated through a practical case involving the determination of a set of EGPF portfolios and the identification of a specific portfolio, obtained as the solution to a static and dynamic optimization problem in which the objectives have been formalized through the calculation of the associated utility.
"The link between MiFID and Risk Appetite Framework as an application of best practices for wealth management and the entire value chain of the financial industry", Gianluca Macchia, Emanuele De Angelis, Michele Vitagliano. Risk Management Magazine, published 2023-12-01. DOI: 10.47473/2020rmm0134
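The three dimensions cited above (risk aversion, loss aversion, reflection) can be made concrete with a prospect-theory-style value function; a minimal sketch, assuming the classic Tversky-Kahneman parameter estimates rather than any calibration from the paper:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value function in the spirit of prospect theory, covering the three
    dimensions cited above:
    - risk aversion: concavity over gains (alpha < 1)
    - reflection: risk attitudes flip over losses (beta < 1, convex)
    - loss aversion: losses loom larger than gains (lam > 1)
    Parameters are the classic Tversky-Kahneman estimates, used here only
    for illustration; the paper's calibration is not reproduced."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

gain_utility = prospect_value(100.0)    # dampened reward for a gain
loss_utility = prospect_value(-100.0)   # amplified penalty for a loss
```

With these parameters a 100-unit loss is felt more than twice as strongly as a 100-unit gain, which is exactly the "I don't like risk, but I like to win" attitude in the abstract.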
In December 2017, the Basel Committee published "Basel III: Finalising post-crisis reforms" (also known as Basel IV), which introduces the Standardised Measurement Approach (SMA) to define the Pillar I operational risk capital requirement; the SMA is expected to enter into force on 1 January 2025, replacing all existing approaches. This approach not only introduces a new method for calculating the operational risk capital requirement but also details several updates to be applied to the main components of the framework, such as Governance, Loss Data Collection and Risk Self-Assessment. With the entry into force of the SMA, banks have the chance to fully rethink their Operational Risk Management Framework (ORMF), integrating its components and making it more efficient and effective in terms of data governance, process management and reporting. This paper describes the SMA methodology for calculating the Pillar I operational risk capital requirement and provides an overview of the expected impact on the different components of banks' ORMF.
"Operational Risk framework and Standardised Measurement Approach (SMA)", Paolo Fabris, Alessandro Leoni, Ilaria Marfella. Risk Management Magazine, published 2023-12-01. DOI: 10.47473/2020rmm0133
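For orientation, the SMA combines a Business Indicator Component (BIC) with an Internal Loss Multiplier (ILM); a minimal sketch of the Basel III formulas as published (amounts in EUR bn; bucket thresholds, coefficients and the 0.8 exponent are from the standard, while the inputs in any real calculation come from a bank's own Business Indicator and loss history):

```python
import math

def bic(bi):
    """Business Indicator Component: marginal coefficients by bucket
    (amounts in EUR bn): 12% up to 1, 15% between 1 and 30, 18% above 30."""
    return (0.12 * min(bi, 1.0)
            + 0.15 * max(min(bi, 30.0) - 1.0, 0.0)
            + 0.18 * max(bi - 30.0, 0.0))

def sma_capital(bi, avg_annual_loss):
    """Pillar I operational risk capital = BIC * ILM, where
    LC = 15 * average annual operational losses (10-year window) and
    ILM = ln(e - 1 + (LC / BIC) ** 0.8); bucket-1 banks (BI <= EUR 1bn)
    use ILM = 1, so their requirement equals the BIC."""
    if bi <= 1.0:
        return bic(bi)
    lc = 15.0 * avg_annual_loss
    ilm = math.log(math.e - 1.0 + (lc / bic(bi)) ** 0.8)
    return bic(bi) * ilm
```

Note the design of the ILM: when loss experience is exactly in line with the BIC (LC = BIC), the multiplier equals one and the requirement collapses to the BIC alone; heavier loss histories scale the requirement up.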
Adamaria Perrotta, Andrea Monaco, Georgios Bliatsios
Given the nature of the lending industry and its importance for global economic stability, financial institutions have always been keen on estimating the risk profile of their clients. For this reason, several sophisticated techniques for modelling credit risk have been developed and implemented in recent years. After the financial crisis of 2007-2008, credit risk management was further expanded and acquired significant regulatory importance. Specifically, the Basel II and III Accords strengthened the conditions that banks must fulfil to develop their own internal models for estimating regulatory capital and expected losses. After motivating the importance of credit risk modelling in the banking sector, this contribution reviews the traditional statistical methods used for credit risk management. It then focuses on more recent approaches based on Machine Learning, critically comparing tradition and innovation in credit risk modelling. Finally, a case study addresses the main steps needed to develop and validate a Probability of Default model for risk prediction via Machine Learning techniques.
"Data Analytics for Credit Risk Models in Retail Banking: a new era for the banking system", Adamaria Perrotta, Andrea Monaco, Georgios Bliatsios. Risk Management Magazine, published 2023-12-01. DOI: 10.47473/2020rmm0132
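As a minimal illustration of the "traditional" end of the spectrum the paper reviews, a one-feature logistic regression PD model can be fitted by gradient descent; the feature, data-generating process and training loop below are purely illustrative and are not the paper's case study:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_pd_model(x, y, lr=0.1, epochs=1500):
    """One-feature logistic regression fitted by batch gradient descent:
    the classic scorecard-style PD model. x: feature values, y: 0/1
    default flags. Returns (intercept, slope)."""
    b0, b1, n = 0.0, 0.0, len(x)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            err = sigmoid(b0 + b1 * xi) - yi
            g0 += err
            g1 += err * xi
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# Synthetic, illustrative data: higher leverage -> higher default frequency
random.seed(0)
leverage = [random.uniform(0.0, 1.0) for _ in range(400)]
default = [1 if random.random() < 0.1 + 0.6 * l else 0 for l in leverage]
b0, b1 = fit_pd_model(leverage, default)
pd_low = sigmoid(b0 + b1 * 0.1)   # estimated PD, low-leverage borrower
pd_high = sigmoid(b0 + b1 * 0.9)  # estimated PD, high-leverage borrower
```

The Machine Learning alternatives the paper compares (tree ensembles, neural networks and the like) replace the linear score b0 + b1*x with a learned non-linear function, while the validation steps (discrimination, calibration, stability) stay conceptually the same.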
The correct modeling of the interest rate term structure is of primary importance, since the forward rates and discount factors used in any financial and risk analysis are calculated from it. The market turbulence of recent years, with negative interest rates followed by their recent substantial rise, the COVID pandemic crisis and the political instability linked to the war between Russia and Ukraine, has often produced anomalies in the shape of the interest rate curve that are difficult to represent with traditional econometric models, to the point that researchers have addressed the modeling problem with Machine Learning methodologies. The purpose of this study is to design a model selection heuristic which, starting from the traditional models (Nelson-Siegel, Svensson and de Rezende-Ferreira) up to Gaussian Process (GP) Regression, is able to define the best representation for a generic term structure. The approach has been tested over the past five years on term structures denominated in five currencies: the Swiss Franc (CHF), the Euro (EUR), the British Pound (GBP), the Japanese Yen (JPY) and the U.S. Dollar (USD).
"Modeling the interest rates term structure using Machine Learning: a Gaussian process regression approach", Alessio Delucchi, P. Giribone. Risk Management Magazine, published 2023-12-01. DOI: 10.47473/2020rmm0131
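Of the traditional models cited, Nelson-Siegel is the simplest: a yield curve described by a level, a slope and a curvature factor. A minimal sketch of its yield function, with purely illustrative parameter values (its limits recover the long-run level beta0 and the short rate beta0 + beta1, which is what makes the parameters interpretable):

```python
import math

def nelson_siegel(t, beta0, beta1, beta2, lam):
    """Nelson-Siegel yield at maturity t (years): beta0 is the long-run
    level, beta1 the short-end slope, beta2 the medium-term curvature and
    lam the decay scale. Parameter values below are illustrative only."""
    x = t / lam
    loading = (1.0 - math.exp(-x)) / x
    return beta0 + beta1 * loading + beta2 * (loading - math.exp(-x))

# Example curve: 3% long rate, lower short end, mild mid-curve hump
curve = [nelson_siegel(t, 0.03, -0.01, 0.02, 2.0) for t in (0.25, 1, 2, 5, 10, 30)]
```

Svensson and de Rezende-Ferreira add further hump terms of the same form, while GP regression drops the parametric shape entirely, which is why it can accommodate the anomalous curve shapes the abstract mentions.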
The Crisis Management and Deposit Insurance Framework, which came into force about ten years ago, is under review by the European Commission. The need for its revision stems from certain shortcomings and inconsistencies that have emerged in its application in Europe, and especially in Italy. The central topics of the debate concern how resolution should be applied and possible innovations in the tools available to manage the crises of small and medium-sized banks, which until now have been handled on the basis of procedures and tools decided at the national level. The aim of this paper is to investigate the areas subject to reform, using as an evaluation criterion the objective of increasing the flexibility of the framework, since this is considered a fundamental requirement for the full effectiveness of the overall banking crisis management system.
"The revision of the banking crisis management and deposit insurance framework in Europe: Why is it important to enhance flexibility?", Giuseppe Boccuzzi. Risk Management Magazine, published 2023-08-29. DOI: 10.47473/2020rmm0127
The Heston model is one of the most widely used techniques for estimating the fair value and the risk measures associated with investment certificates. Typically, the pricing engine simulates a large number of projections of the underlying until maturity, calculates the pay-off for each simulated path according to the characteristics of the structured product and, following the Monte Carlo methodology, determines the theoretical value as the mean of the pay-offs discounted at valuation time. To generate the future paths, the two stochastic differential equations governing the dynamics of the Heston model must be integrated simultaneously over time: the one directly associated with the underlying and the one associated with its variance. Consequently, it is essential to implement a numerical integration scheme that supports such prospective simulations. The present study considers alternatives to the traditional Euler method, with the aim of reducing or, in some cases, eliminating the probability of obtaining unfeasible simulated values for the variance. Indeed, one of the main drawbacks of the basic Euler integration scheme applied to the Heston bivariate stochastic model is that it can generate negative variances in the simulation, which must be programmatically corrected whenever they occur.
"Analysis of numerical integration schemes for the Heston model: a case study based on the pricing of investment certificates", Michelangelo Fusaro, P. Giribone, Alessio Tissone. Risk Management Magazine, published 2023-08-29. DOI: 10.47473/2020rmm0125
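One widely used correction for the negative-variance problem is the "full truncation" Euler scheme, which floors the variance at zero wherever it enters the dynamics. This is not the Transformed Volatility scheme the study favours (that one integrates an SDE for the volatility itself), but it makes the issue concrete; parameters below are illustrative, not taken from the paper:

```python
import math
import random

def heston_terminal_prices(s0, v0, kappa, theta, xi, rho, r, T, steps, n_paths, seed=42):
    """Euler scheme with 'full truncation': the variance fed into drift and
    diffusion is floored at zero (v_plus = max(v, 0)), so a stored negative
    variance never propagates an invalid square root into the dynamics.
    Returns the simulated terminal prices of the underlying."""
    random.seed(seed)
    dt = T / steps
    out = []
    for _ in range(n_paths):
        s, v = s0, v0
        for _ in range(steps):
            v_plus = max(v, 0.0)
            z1 = random.gauss(0.0, 1.0)
            # correlate the two Brownian increments
            z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
            s *= math.exp((r - 0.5 * v_plus) * dt + math.sqrt(v_plus * dt) * z1)
            v += kappa * (theta - v_plus) * dt + xi * math.sqrt(v_plus * dt) * z2
        out.append(s)
    return out

# Illustrative parameters (not taken from the paper)
terminal = heston_terminal_prices(100.0, 0.04, 1.5, 0.04, 0.3, -0.7, 0.02, 1.0, 50, 2000)
```

Because the log-Euler step for the underlying is exponential, simulated prices stay strictly positive, and the discounted mean of the terminal prices recovers the forward price up to Monte Carlo error.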
Share buybacks have become a popular way for companies to return capital to shareholders. However, there is an ongoing debate on the impact of share buybacks on performance and shareholder value. This paper starts by examining the literature on share buybacks and aims to test the signalling hypothesis (i.e., that share buybacks are carried out to signal undervaluation of the stock) on share repurchases performed by banks. More specifically, the analysis measures the impact of share buybacks on banks' performance as proxied by the return on equity (ROE). The results show a weakly significant positive linear relationship between banks' share buybacks and their ROE.
"Does the banks' performance improve after share buybacks?", M. Brogi, Michelangelo Bruno, Valentina Lagasio. Risk Management Magazine, published 2023-08-29. DOI: 10.47473/2020rmm0124
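The kind of linear relationship the paper tests can be sketched with a plain OLS fit of ROE on buyback intensity; the data below are synthetic and purely illustrative of the method, not the paper's bank panel:

```python
import random

def ols_fit(x, y):
    """Ordinary least squares of y on x, returning (intercept, slope, r),
    where r is the Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    return my - slope * mx, slope, sxy / (sxx * syy) ** 0.5

# Synthetic panel: buyback intensity vs ROE with a positive link by construction
random.seed(1)
buyback = [random.uniform(0.0, 0.05) for _ in range(100)]
roe = [0.08 + 0.5 * b + random.gauss(0.0, 0.01) for b in buyback]
intercept, slope, r = ols_fit(buyback, roe)
```

In the paper's setting the sign and significance of the slope are the object of the test: a positive, significant slope is consistent with the signalling hypothesis.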
AIFIRM, with the aim of spreading financial culture and drawing attention to financial education, intends to offer teaching material for anyone willing to act as an educator on financial topics. This material, submitted to the AIFIRM Board, is addressed to pupils of primary, lower secondary and upper secondary schools, and has been prepared (with further material in preparation) in line with the guidelines for the development of financial education skills in schools issued by the Committee for the Planning and Coordination of Financial Education Activities. The main objective is to lay the foundations for the skills needed to build a sound relationship with money, an adequate perception and management of risks, and an understanding of how collective decisions have economic implications for oneself and for the society one belongs to. The following article on the evolution of Risk Management is addressed to Italian students in the final three years of upper secondary school as part of the AIFIRM Financial Education programme developed within the AIFIRM research Commission on the subject.
"L'evoluzione del Risk Management: dal Passato al Presente, un 'Pilastro' della Stabilità Finanziaria", Marilena Cino. Risk Management Magazine, published 2023-08-29. DOI: 10.47473/2020rmm0128
The aim of the paper is to explain what is meant by Digital Risk & Governance. To this end, it is important to retrace the technological evolution of the last few decades: from branches to Mobile Banking, from the digitalization of transactions to the creation of Fintech, from the first process automations to Artificial Intelligence. This evolutionary journey has involved, and still involves, not only the birth of new technologies but also the possibility of seizing new business opportunities, and therefore necessarily of facing new types of risk, which are not always intuitive or easy to fully understand and manage. In this context, the role of the Regulator is fundamental, not only in providing companies with the elements for a correct and complete understanding of Digital/ICT Risk, but also in issuing guidelines for the construction of an organizational and governance model suitable for gaining awareness of this risk and for assessing, managing and monitoring it. A fundamental role is played by the Digital Operational Resilience Act (DORA), which better defines some aspects that until recently had no clear place and, even more importantly, brings them into an organic and holistic framework. Governance and organization are essential in this panorama, being the only functions capable of spreading the risk culture needed to overcome the silo mentality and to establish the cultural paradigm shift essential for managing ICT Risk. Given the breadth of the perimeter generally included under this risk, the paper underlines the most relevant aspects and suggests, in practical terms, the components on which companies should concentrate to implement a usable, all-round management framework: from the identification of critical functions to the importance of tools capable of certifying the correctness, completeness and quality of the data.
Another prominent and closely related theme, which the paper could not fail to address, is the cyberattack and its impact on the market. The paper closes with a theme that, in our opinion, plays an even more decisive role than the creation of an overall framework: the Digital Strategy, consciously accessible only through a Digital Risk & Governance framework, and the ultimate goal to which companies should aspire.
"The growing importance of digital risk&governance", Valerio Begozzi, Matteo Oldani, Francesca Terrizzano. Risk Management Magazine, published 2023-08-29. DOI: 10.47473/2020rmm0126
A. Bottasso, Michelangelo Fusaro, P. Giribone, Alessio Tissone
Certificates are structured financial instruments that aim to provide investors with investment solutions tailored to their needs. Certificates can be modeled as a bond component plus a derivative component, typically an options strategy. Their pricing is typically performed with the Monte Carlo numerical methodology, which projects the underlying using series of random numbers. The results display an error (standard deviation) that depends on the number of simulations and on the specific characteristics of the structured product. This work aims to minimize the experimental error and, consequently, to accelerate the speed of convergence using statistical techniques known in the literature as variance reduction methods. The most popular stochastic dynamics have been analyzed: the classical Black and Scholes model, the Local Volatility model and the Heston model. Three certificates with different payoffs are analyzed in the paper. The variance reduction techniques, implemented in different programming languages (Python, Matlab and R), are: Latin Hypercube, Stratified Sampling, Antithetic Variables, Importance Sampling, Moment Matching and Control Variates.
"Implementation of variance reduction techniques applied to the pricing of investment certificates", A. Bottasso, Michelangelo Fusaro, P. Giribone, Alessio Tissone. Risk Management Magazine, published 2023-04-29. DOI: 10.47473/2020rmm0121
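As an example of one of the listed techniques, Antithetic Variables pairs each Gaussian draw z with -z so that the two pay-offs' errors partially cancel. A minimal sketch on a plain European call under Black-Scholes dynamics (instrument and parameters are illustrative, far simpler than the certificates priced in the paper):

```python
import math
import random

def mc_call_price(s0, k, r, sigma, T, n, antithetic=False, seed=7):
    """Monte Carlo price of a European call under Black-Scholes dynamics.
    With antithetic=True each draw z is paired with -z and the two pay-offs
    are averaged, so their errors partially cancel and the per-sample
    variance drops at no extra cost. Returns (price, sample variance)."""
    random.seed(seed)
    disc = math.exp(-r * T)
    drift = (r - 0.5 * sigma ** 2) * T
    vol = sigma * math.sqrt(T)
    samples = []
    for _ in range(n):
        z = random.gauss(0.0, 1.0)
        p = max(s0 * math.exp(drift + vol * z) - k, 0.0)
        if antithetic:
            p = 0.5 * (p + max(s0 * math.exp(drift - vol * z) - k, 0.0))
        samples.append(disc * p)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean, var

price_plain, var_plain = mc_call_price(100.0, 100.0, 0.05, 0.2, 1.0, 20000)
price_anti, var_anti = mc_call_price(100.0, 100.0, 0.05, 0.2, 1.0, 20000, antithetic=True)
```

The variance reduction is guaranteed here because the call pay-off is monotone in z; for the non-monotone pay-offs of some certificates the gain is smaller, which is one reason the paper compares several techniques.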