Pub Date: 2024-09-18 | DOI: 10.1017/s1748499524000198
Pouya Faroughi, Shu Li, Jiandong Ren
The generalized Poisson (GP) distribution was introduced by Consul & Jain ((1973). Technometrics, 15(4), 791–799.). Since then, it has found various applications in actuarial science and other areas. In this paper, we focus on the distributional properties of the GP and its related distributions. In particular, we study the distributional properties of distributions in the $\mathcal{H}$ family, which includes the GP and generalized negative binomial distributions as special cases. We demonstrate that the moment and size-biased transformations of distributions within the $\mathcal{H}$ family remain in the same family, which significantly extends the results presented in Ambagaspitiya & Balakrishnan ((1994). ASTIN Bulletin: The Journal of the IAA, 24(2), 255–263.) and Ambagaspitiya ((1995). Insurance: Mathematics and Economics, 16(2), 107–127.). These findings enable us to provide recursive formulas for evaluating risk measures, such as the Value-at-Risk and conditional tail expectation, of compound GP distributions. In addition, we show that these risk measures can be calculated using transform methods, such as the fast Fourier transform; in fact, the transform method shows a remarkable time advantage over the recursive method. We numerically compare the risk measures of the compound sums when the primary distributions are Poisson and GP. The results illustrate the model risk associated with the choice of loss frequency distribution.
{"title":"Generalized Poisson random variable: its distributional properties and actuarial applications","authors":"Pouya Faroughi, Shu Li, Jiandong Ren","doi":"10.1017/s1748499524000198","DOIUrl":"https://doi.org/10.1017/s1748499524000198","url":null,"abstract":"Generalized Poisson (GP) distribution was introduced in Consul & Jain ((1973). <jats:italic>Technometrics</jats:italic>, 15(4), 791–799.). Since then it has found various applications in actuarial science and other areas. In this paper, we focus on the distributional properties of GP and its related distributions. In particular, we study the distributional properties of distributions in the <jats:inline-formula> <jats:alternatives> <jats:inline-graphic xmlns:xlink=\"http://www.w3.org/1999/xlink\" mime-subtype=\"png\" xlink:href=\"S1748499524000198_inline1.png\"/> <jats:tex-math> $mathcal{H}$ </jats:tex-math> </jats:alternatives> </jats:inline-formula> family, which includes GP and generalized negative binomial distributions as special cases. We demonstrate that the moment and size-biased transformations of distributions within the <jats:inline-formula> <jats:alternatives> <jats:inline-graphic xmlns:xlink=\"http://www.w3.org/1999/xlink\" mime-subtype=\"png\" xlink:href=\"S1748499524000198_inline2.png\"/> <jats:tex-math> $mathcal{H}$ </jats:tex-math> </jats:alternatives> </jats:inline-formula> family remain in the same family, which significantly extends the results presented in Ambagaspitiya & Balakrishnan ((1994). <jats:italic>ASTINBulletin: the Journal of the IAA</jats:italic>, 24(2), 255–263.) and Ambagaspitiya ((1995). <jats:italic>Insurance Mathematics and Economics</jats:italic>, 2(16), 107–127.). Such findings enable us to provide recursive formulas for evaluating risk measures, such as Value-at-Risk and conditional tail expectation of the compound GP distributions. In addition, we show that the risk measures can be calculated by making use of transform methods, such as fast Fourier transform. In fact, the transformation method showed a remarkable time advantage over the recursive method. We numerically compare the risk measures of the compound sums when the primary distributions are Poisson and GP. The results illustrate the model risk for the loss frequency distribution.","PeriodicalId":44135,"journal":{"name":"Annals of Actuarial Science","volume":"37 1","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142250381","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-31 | DOI: 10.1017/s1748499524000162
Zinoviy Landsman, Tomer Shushi
Risk measurement and econometrics are the two pillars of actuarial science. Unlike econometrics, risk measurement allows decision-makers' risk aversion to be taken into account when analyzing risks. We propose a hybrid model that captures decision-makers' regression-based approach to studying risks, focusing on explanatory variables while paying attention to risk severity. Our model considers different loss functions, provided by the risk manager or the actuary, that quantify the severity of the losses. We present an explicit formula for the regression estimators of the proposed risk-based regression problem and study the resulting estimators. Finally, we provide a numerical study of the results using data from the insurance industry.
{"title":"Optimizing insurance risk assessment: a regression model based on a risk-loaded approach","authors":"Zinoviy Landsman, Tomer Shushi","doi":"10.1017/s1748499524000162","DOIUrl":"https://doi.org/10.1017/s1748499524000162","url":null,"abstract":"Risk measurement and econometrics are the two pillars of actuarial science. Unlike econometrics, risk measurement allows taking into account decision-makers’ risk aversion when analyzing the risks. We propose a hybrid model that captures decision-makers’ regression-based approach to study risks, focusing on explanatory variables while paying attention to risk severity. Our model considers different loss functions that quantify the severity of the losses that are provided by the risk manager or the actuary. We present an explicit formula for the regression estimators for the proposed risk-based regression problem and study the proposed results. Finally, we provide a numerical study of the results using data from the insurance industry.","PeriodicalId":44135,"journal":{"name":"Annals of Actuarial Science","volume":"455 1","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141195948","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-21 | DOI: 10.1017/s1748499524000113
Jean-Philippe Boucher, Raïssa Coulibaly
Building on recent papers, two distributions for the total claims amount (loss cost) are considered: the compound Poisson-gamma and the Tweedie distributions. Each is used as the underlying distribution in a Bonus-Malus Scale (BMS) model. The BMS model links the premium of an insurance contract to a function of the insurance experience of the related policy. In other words, the idea is to model the increase and the decrease in premiums for insureds who do or do not file claims. We applied our approach to a sample of data from a major insurance company in Canada and analyzed data fit and predictability. We show that the studied models are exciting alternatives to consider from a practical point of view, and that predictive ratemaking models can address some important practical considerations.
{"title":"Bonus-Malus Scale premiums for Tweedie’s compound Poisson models","authors":"Jean-Philippe Boucher, Raïssa Coulibaly","doi":"10.1017/s1748499524000113","DOIUrl":"https://doi.org/10.1017/s1748499524000113","url":null,"abstract":"Based on the recent papers, two distributions for the total claims amount (loss cost) are considered: compound Poisson-gamma and Tweedie. Each is used as an underlying distribution in the Bonus-Malus Scale (BMS) model. The BMS model links the premium of an insurance contract to a function of the insurance experience of the related policy. In other words, the idea is to model the increase and the decrease in premiums for insureds who do or do not file claims. We applied our approach to a sample of data from a major insurance company in Canada. Data fit and predictability were analyzed. We showed that the studied models are exciting alternatives to consider from a practical point of view, and that predictive ratemaking models can address some important practical considerations.","PeriodicalId":44135,"journal":{"name":"Annals of Actuarial Science","volume":"142 1","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141147997","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-14 | DOI: 10.1017/s1748499524000137
Dechen Gao, Jiandong Ren
This paper studies a hierarchical risk model where an accident can cause a combination of different types of claims, whose sizes could be dependent. In addition, the frequencies of accidents that cause the different combinations of claims are dependent. We first derive formulas for computing risk measures, such as the Tail Conditional Expectation and Tail Variance of the aggregate losses for a portfolio of businesses. Then, we present formulas for performing the associated capital allocation to different types of claims in the portfolio. The main tool we used is the moment (or size-biased) transform of the multivariate distributions.
{"title":"Risk analysis of a multivariate aggregate loss model with dependence","authors":"Dechen Gao, Jiandong Ren","doi":"10.1017/s1748499524000137","DOIUrl":"https://doi.org/10.1017/s1748499524000137","url":null,"abstract":"<p>This paper studies a hierarchical risk model where an accident can cause a combination of different types of claims, whose sizes could be dependent. In addition, the frequencies of accidents that cause the different combinations of claims are dependent. We first derive formulas for computing risk measures, such as the Tail Conditional Expectation and Tail Variance of the aggregate losses for a portfolio of businesses. Then, we present formulas for performing the associated capital allocation to different types of claims in the portfolio. The main tool we used is the moment (or size-biased) transform of the multivariate distributions.</p>","PeriodicalId":44135,"journal":{"name":"Annals of Actuarial Science","volume":"165 1","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-05-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140930889","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-13 | DOI: 10.1017/s1748499524000101
Cole van Jaarsveldt, Gareth W. Peters, Matthew Ames, Mike Chantler
This paper outlines the functionality available in the CovRegpy package, which was written for actuarial practitioners, wealth managers, fund managers, and portfolio analysts in Python 3.11. The objective is to develop a new class of covariance regression factor models for covariance forecasting, along with a library of portfolio allocation tools that integrate with this new covariance forecasting framework. The novelty is in two stages: the type of covariance regression model and the factor extractions used to construct the covariates for the covariance regression, along with a powerful portfolio allocation framework for dynamic multi-period asset investment management. The major contributions of the CovRegpy package can be found on the GitHub repository for this library in the scripts CovRegpy.py, CovRegpy_DCC.py, CovRegpy_RPP.py, CovRegpy_SSA.py, CovRegpy_SSD.py, and CovRegpy_X11.py. These six scripts contain implementations of software features including multivariate covariance time series models based on the regularized covariance regression (RCR) framework, the dynamic conditional correlation (DCC) framework, risk premia parity (RPP) weighting functions, singular spectrum analysis (SSA), singular spectrum decomposition (SSD), and the X11 decomposition framework, respectively. These techniques can be used sequentially or independently with other techniques to extract implicit factors, use them as covariates in the RCR framework to forecast covariance and correlation structures, and finally apply portfolio weighting strategies based on portfolio risk measures computed under the forecasted covariance assumptions. Explicit financial factors can be used in the covariance regression framework, implicit factors can be used in the traditional explicit market factor setting, and RPP techniques with long/short equity weighting strategies can be used in traditional covariance assumption frameworks. We examine, herein, two real-world case studies for actuarial practitioners. The first is a modification (demonstrating the regularization of covariance regression) of the original example from Hoff & Niu ((2012). Statistica Sinica, 22(2), 729–753.), which modeled the covariance and correlative relationship that exists between forced expiratory volume (FEV) and age and between FEV and height. We examine this within the context of making probabilistic predictions about mortality rates in patients with chronic obstructive pulmonary disease. The second case study is a more complete example using this package wherein we present a funded and unfunded UK pension example. The decomposition algorithm isolates high-, mid-, and low-frequency […]
{"title":"Package CovRegpy: Regularized covariance regression and forecasting in Python","authors":"Cole van Jaarsveldt, Gareth W. Peters, Matthew Ames, Mike Chantler","doi":"10.1017/s1748499524000101","DOIUrl":"https://doi.org/10.1017/s1748499524000101","url":null,"abstract":"This paper will outline the functionality available in the <jats:sans-serif>CovRegpy</jats:sans-serif> package which was written for actuarial practitioners, wealth managers, fund managers, and portfolio analysts in the language of <jats:monospace>Python 3.11</jats:monospace>. The objective is to develop a new class of covariance regression factor models for covariance forecasting, along with a library of portfolio allocation tools that integrate with this new covariance forecasting framework. The novelty is in two stages: the type of covariance regression model and factor extractions used to construct the covariates used in the covariance regression, along with a powerful portfolio allocation framework for dynamic multi-period asset investment management. The major contributions of package <jats:sans-serif>CovRegpy</jats:sans-serif> can be found on the GitHub repository for this library in the scripts: <jats:monospace>CovRegpy.py</jats:monospace>, <jats:monospace>CovRegpy_DCC.py</jats:monospace>, <jats:monospace>CovRegpy_RPP.py</jats:monospace>, <jats:monospace>CovRegpy_SSA.py</jats:monospace>, <jats:monospace>CovRegpy_SSD.py</jats:monospace>, and <jats:monospace>CovRegpy_X11.py</jats:monospace>. These six scripts contain implementations of software features including multivariate covariance time series models based on the regularized covariance regression (RCR) framework, dynamic conditional correlation (DCC) framework, risk premia parity (RPP) weighting functions, singular spectrum analysis (SSA), singular spectrum decomposition (SSD), and X11 decomposition framework, respectively. These techniques can be used sequentially or independently with other techniques to extract implicit factors to use them as covariates in the RCR framework to forecast covariance and correlation structures and finally apply portfolio weighting strategies based on the portfolio risk measures based on forecasted covariance assumptions. Explicit financial factors can be used in the covariance regression framework, implicit factors can be used in the traditional explicit market factor setting, and RPP techniques with long/short equity weighting strategies can be used in traditional covariance assumption frameworks. We examine, herein, two real-world case studies for actuarial practitioners. The first of these is a modification (demonstrating the regularization of covariance regression) of the original example from Hoff & Niu ((2012). <jats:italic>Statistica Sinica</jats:italic>, 22(2), 729–753) which modeled the covariance and correlative relationship that exists between forced expiratory volume (FEV) and age and FEV and height. We examine this within the context of making probabilistic predictions about mortality rates in patients with chronic obstructive pulmonary disease. The second case study is a more complete example using this package wherein we present a funded and unfunded UK pension example. 
The decomposition algorithm isolates high-, mid-, and low-frequen","PeriodicalId":44135,"journal":{"name":"Annals of Actuarial Science","volume":"42 1","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-05-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140930684","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
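The covariance regression form referenced via Hoff & Niu (2012) is Sigma(x) = Psi + B x x' B', a baseline covariance plus a rank-one component driven by covariates x. The sketch below evaluates that forecast and applies a naive inverse-volatility weighting as a stand-in for a risk-parity-style allocation step; Psi, B, and the covariate values are placeholders rather than CovRegpy output, and the package's regularized estimation, implicit factor extraction, DCC, and full RPP machinery are not reproduced.

```python
import numpy as np

# Covariance regression forecast Sigma(x) = Psi + B x x' B' for 4 assets and 2 covariates,
# followed by a naive inverse-volatility weighting. Psi, B, and x are placeholders.
Psi = np.diag([0.04, 0.03, 0.05, 0.02])        # baseline ("noise") covariance
B = np.array([[0.5, 0.1],
              [0.4, 0.2],
              [0.3, 0.4],
              [0.2, 0.3]])                     # loadings of the conditional covariance on x

def forecast_cov(x):
    x = np.asarray(x, dtype=float)
    return Psi + B @ np.outer(x, x) @ B.T      # rank-one covariance regression forecast

def inverse_vol_weights(cov):
    w = 1.0 / np.sqrt(np.diag(cov))            # crude stand-in for a risk-parity style step
    return w / w.sum()

cov_next = forecast_cov([1.0, -0.5])           # covariate values for the next period
print(cov_next)
print(inverse_vol_weights(cov_next))
```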
Pub Date: 2024-05-13 | DOI: 10.1017/s1748499524000046
Pasin Marupanthorn, Gareth W. Peters, Eric D. Ofosu-Hene, Christina S. Nikitopoulos, Kylie-Anne Richards
This paper introduces DivFolio, a multiperiod portfolio selection and analytics software application that incorporates automated and user-determined divestment practices, accommodating Environmental, Social, and Governance (ESG) and portfolio carbon footprint considerations. This freely available portfolio analytics software tool is written in R, with a GUI developed as an R Shiny application for ease of use. Users can utilize this software to dynamically assess the performance of asset selections from global equity, exchange-traded fund, exchange-traded note, and depositary receipt markets over multiple time periods. This assessment is based on the impact of ESG investment and fossil-fuel divestment practices on portfolio behavior in terms of risk, return, stability, diversification, and the climate mitigation credentials of the associated investment decisions. We highlight two applications of DivFolio. The first revolves around using sector scanning to divest from a specialized portfolio featuring constituents of the FTSE 100. The second, rooted in actuarial considerations, focuses on divestment strategies informed by environmental risk assessments for mixed pension portfolios in the US and UK.
{"title":"DivFolio: a Shiny application for portfolio divestment in green finance wealth management","authors":"Pasin Marupanthorn, Gareth W. Peters, Eric D. Ofosu-Hene, Christina S. Nikitopoulos, Kylie-Anne Richards","doi":"10.1017/s1748499524000046","DOIUrl":"https://doi.org/10.1017/s1748499524000046","url":null,"abstract":"This paper introduces <jats:italic>DivFolio</jats:italic>, a multiperiod portfolio selection and analytic software application that incorporates automated and user-determined divestment practices accommodating Environmental Social Governance (ESG) and portfolio carbon footprint considerations. This freely available portfolio analytics software tool is written in R with a GUI interface developed as an R Shiny application for ease of user experience. Users can utilize this software to dynamically assess the performance of asset selections from global equity, exchange-traded funds, exchange-traded notes, and depositary receipts markets over multiple time periods. This assessment is based on the impact of ESG investment and fossil-fuel divestment practices on portfolio behavior in terms of risk, return, stability, diversification, and climate mitigation credentials of associated investment decisions. We highlight two applications of <jats:italic>DivFolio</jats:italic>. The first revolves around using sector scanning to divest from a specialized portfolio featuring constituents of the FTSE 100. The second, rooted in actuarial considerations, focuses on divestment strategies informed by environmental risk assessments for mixed pension portfolios in the US and UK.","PeriodicalId":44135,"journal":{"name":"Annals of Actuarial Science","volume":"129 1","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-05-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140930888","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-13 | DOI: 10.1017/s1748499524000095
Donatien Hainaut
Guaranteed minimum accumulation benefits (GMABs) are retirement savings vehicles that protect the policyholder against downside market risk. This article proposes a valuation method for these contracts based on physics-inspired neural networks (PINNs), in the presence of multiple financial and biometric risk factors. A PINN integrates principles from physics into its learning process to enhance its efficiency in solving complex problems. In this article, the driving principle is the Feynman–Kac (FK) equation, which is a partial differential equation (PDE) governing the GMAB price in an arbitrage-free market. In our context, the FK PDE depends on multiple variables and is difficult to solve using classical finite difference approximations. In comparison, PINNs constitute an efficient alternative that can evaluate GMABs with various specifications without the need for retraining. To illustrate this, we consider a market with four risk factors. We first derive a closed-form expression for the GMAB that serves as a benchmark for the PINN. Next, we propose a scaled version of the FK equation that we solve using a PINN. Pricing errors are analyzed in a numerical illustration.
{"title":"Valuation of guaranteed minimum accumulation benefits (GMABs) with physics-inspired neural networks","authors":"Donatien Hainaut","doi":"10.1017/s1748499524000095","DOIUrl":"https://doi.org/10.1017/s1748499524000095","url":null,"abstract":"Guaranteed minimum accumulation benefits (GMABs) are retirement savings vehicles that protect the policyholder against downside market risk. This article proposes a valuation method for these contracts based on physics-inspired neural networks (PINNs), in the presence of multiple financial and biometric risk factors. A PINN integrates principles from physics into its learning process to enhance its efficiency in solving complex problems. In this article, the driving principle is the Feynman–Kac (FK) equation, which is a partial differential equation (PDE) governing the GMAB price in an arbitrage-free market. In our context, the FK PDE depends on multiple variables and is difficult to solve using classical finite difference approximations. In comparison, PINNs constitute an efficient alternative that can evaluate GMABs with various specifications without the need for retraining. To illustrate this, we consider a market with four risk factors. We first derive a closed-form expression for the GMAB that serves as a benchmark for the PINN. Next, we propose a scaled version of the FK equation that we solve using a PINN. Pricing errors are analyzed in a numerical illustration.","PeriodicalId":44135,"journal":{"name":"Annals of Actuarial Science","volume":"43 1","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-05-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140930680","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-04-22 | DOI: 10.1017/s1748499524000083
Susanna Levantesi, Massimiliano Menzietti, Anna Kamille Nyegaard
The calculation of life and health insurance liabilities is based on assumptions about mortality and disability rates, and insurance companies face systematic insurance risks if the assumptions about these rates change. In this paper, we study how to manage systematic insurance risks in a multi-state setup by considering securities linked to the transition intensities of the model. We assume there exists a market for trading two securities linked to, for instance, mortality and disability rates (the de-risking option and the de-risking swap), and we describe the optimization problem that finds the de-risking strategy minimizing systematic insurance risks in a multi-state setup. We develop a numerical example based on the disability model, and the results imply that systematic insurance risks decrease significantly when de-risking strategies are implemented.
{"title":"De-risking in multi-state life and health insurance","authors":"Susanna Levantesi, Massimiliano Menzietti, Anna Kamille Nyegaard","doi":"10.1017/s1748499524000083","DOIUrl":"https://doi.org/10.1017/s1748499524000083","url":null,"abstract":"The calculation of life and health insurance liabilities is based on assumptions about mortality and disability rates, and insurance companies face systematic insurance risks if assumptions about these rates change. In this paper, we study how to manage systematic insurance risks in a multi-state setup by considering securities linked to the transition intensities of the model. We assume there exists a market for trading two securities linked to, for instance, mortality and disability rates, the de-risking option and the de-risking swap, and we describe the optimization problem to find the de-risking strategy that minimizes systematic insurance risks in a multi-state setup. We develop a numerical example based on the disability model, and the results imply that systematic insurance risks significantly decrease when implementing de-risking strategies.","PeriodicalId":44135,"journal":{"name":"Annals of Actuarial Science","volume":"26 1","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140635614","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-04-01 | DOI: 10.1017/s174849952400006x
Ronald Richman, Mario V. Wüthrich
Deep neural networks have become an important tool for use in actuarial tasks, not only due to the significant gains in accuracy provided by these techniques compared to traditional methods, but also due to the close connection of these models to the generalized linear models (GLMs) currently used in industry. Although constraining GLM parameters relating to insurance risk factors to be smooth or to exhibit monotonicity is trivial, methods to incorporate such constraints into deep neural networks have not yet been developed. This is a barrier to the adoption of neural networks in insurance practice, since actuaries often impose these constraints for commercial or statistical reasons. In this work, we present a novel method for enforcing constraints within deep neural network models, and we show how these models can be trained. Moreover, we provide example applications using real-world datasets. We call our proposed method ICEnet to emphasize the close link of our proposal to the individual conditional expectation model interpretability technique.
{"title":"Smoothness and monotonicity constraints for neural networks using ICEnet","authors":"Ronald Richman, Mario V. Wüthrich","doi":"10.1017/s174849952400006x","DOIUrl":"https://doi.org/10.1017/s174849952400006x","url":null,"abstract":"<p>Deep neural networks have become an important tool for use in actuarial tasks, due to the significant gains in accuracy provided by these techniques compared to traditional methods, but also due to the close connection of these models to the generalized linear models (GLMs) currently used in industry. Although constraining GLM parameters relating to insurance risk factors to be smooth or exhibit monotonicity is trivial, methods to incorporate such constraints into deep neural networks have not yet been developed. This is a barrier for the adoption of neural networks in insurance practice since actuaries often impose these constraints for commercial or statistical reasons. In this work, we present a novel method for enforcing constraints within deep neural network models, and we show how these models can be trained. Moreover, we provide example applications using real-world datasets. We call our proposed method <span>ICEnet</span> to emphasize the close link of our proposal to the individual conditional expectation model interpretability technique.</p>","PeriodicalId":44135,"journal":{"name":"Annals of Actuarial Science","volume":"152 1","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140574264","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-03-26 | DOI: 10.1017/s1748499524000058
Alex Jose, Angus S. Macdonald, George Tzougas, George Streftaris
In this paper, we construct interpretable zero-inflated neural network models for modeling hospital admission counts related to respiratory diseases among a health-insured population and their dependants in the United States. In particular, we exemplify our approach by considering the zero-inflated Poisson neural network (ZIPNN), and we follow the combined actuarial neural network (CANN) approach for developing zero-inflated combined actuarial neural network (ZIPCANN) models for modeling admission rates, which can accommodate the excess zero nature of admission counts data. Furthermore, we adopt the LocalGLMnet approach (Richman & Wüthrich (2023). Scandinavian Actuarial Journal, 2023(1), 71–95.) for interpreting the ZIPNN model results. This facilitates the analysis of the impact of a number of socio-demographic factors on the admission rates related to respiratory disease while benefiting from an improved predictive performance. The real-life utility of the methodologies developed as part of this work lies in the fact that they facilitate accurate rate setting, in addition to offering the potential to inform health interventions.
{"title":"Interpretable zero-inflated neural network models for predicting admission counts","authors":"Alex Jose, Angus S. Macdonald, George Tzougas, George Streftaris","doi":"10.1017/s1748499524000058","DOIUrl":"https://doi.org/10.1017/s1748499524000058","url":null,"abstract":"<p>In this paper, we construct interpretable zero-inflated neural network models for modeling hospital admission counts related to respiratory diseases among a health-insured population and their dependants in the United States. In particular, we exemplify our approach by considering the zero-inflated Poisson neural network (ZIPNN), and we follow the combined actuarial neural network (CANN) approach for developing zero-inflated combined actuarial neural network (ZIPCANN) models for modeling admission rates, which can accommodate the excess zero nature of admission counts data. Furthermore, we adopt the LocalGLMnet approach (Richman & Wüthrich (2023). <span>Scandinavian Actuarial Journal</span>, 2023(1), 71–95.) for interpreting the ZIPNN model results. This facilitates the analysis of the impact of a number of socio-demographic factors on the admission rates related to respiratory disease while benefiting from an improved predictive performance. The real-life utility of the methodologies developed as part of this work lies in the fact that they facilitate accurate rate setting, in addition to offering the potential to inform health interventions.</p>","PeriodicalId":44135,"journal":{"name":"Annals of Actuarial Science","volume":"234 1","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140301621","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}