A Tractable Class of Multivariate Phase-Type Distributions for Loss Modeling
Pub Date: 2021-10-11. DOI: 10.1080/10920277.2023.2167833
Martin Bladt
Phase-type (PH) distributions are a popular tool for the analysis of univariate risks in numerous actuarial applications. Their multivariate counterparts (MPH$^\ast$), however, have not seen such a proliferation, due to a lack of explicit formulas and complicated estimation procedures. A simple construction of multivariate phase-type distributions, denoted mPH, is proposed for the parametric description of multivariate risks, leading to models of considerable probabilistic flexibility and statistical tractability. The main idea is to start different Markov processes at the same state and allow them to evolve independently thereafter, leading to dependent absorption times. By dimension augmentation arguments, this construction can be cast under the umbrella of the MPH$^\ast$ class, but it enjoys explicit formulas that the general specification lacks, including common measures of dependence. Moreover, it is shown that the class is still rich enough to be dense in the set of multivariate risks supported on the positive orthant, and it is the smallest known subclass with this property. In particular, the latter result provides a new short proof of the denseness of the MPH$^\ast$ class. In practice this means that the mPH class allows for the modeling of bivariate risks with any given correlation or copula. We derive an EM algorithm for its statistical estimation and illustrate it on bivariate insurance data. Extensions to more general settings are outlined.
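The mPH construction lends itself to direct simulation. The sketch below is a minimal illustration, not the paper's estimation machinery: two sub-intensity matrices share one randomly drawn initial state, the chains then evolve independently, and the pair of absorption times is dependent only through that shared start. The matrices and initial distribution are invented for illustration.

```python
import numpy as np

def sample_ph(T, start, rng):
    """One absorption time of a Markov jump process with sub-intensity
    matrix T (row sums <= 0), started in transient state `start`."""
    p = T.shape[0]
    exit_rates = -T.sum(axis=1)            # rates into the absorbing state
    t, i = 0.0, start
    while True:
        total = -T[i, i]
        t += rng.exponential(1.0 / total)
        # jump to another transient state, or absorb with the leftover mass
        probs = np.append(np.where(np.arange(p) == i, 0.0, T[i]), exit_rates[i]) / total
        j = rng.choice(p + 1, p=probs)
        if j == p:                          # absorbing state reached
            return t
        i = j

def sample_mph(pi, T1, T2, n, rng):
    """n draws of (X1, X2): both chains start in the SAME pi-distributed state."""
    out = np.empty((n, 2))
    for k in range(n):
        start = rng.choice(len(pi), p=pi)
        out[k] = sample_ph(T1, start, rng), sample_ph(T2, start, rng)
    return out

rng = np.random.default_rng(7)
# illustrative parameters: state 0 absorbs quickly, state 1 slowly
T = np.array([[-10.0, 0.0], [0.0, -0.1]])
pi = np.array([0.5, 0.5])
xs = sample_mph(pi, T, T, 4000, rng)
corr = np.corrcoef(xs[:, 0], xs[:, 1])[0, 1]   # positive: shared initial state
```

Because both marginals are slow or fast together depending on the common starting state, the sample correlation comes out clearly positive even though the chains never interact after time zero.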
Risk-Seeking Behavior and Its Implications for the Optimal Decision Making of Annuity Insurers
Pub Date: 2021-08-17. DOI: 10.1080/10920277.2021.1977144
Cuixia Chen, Yi-Jia Lin, Minghe Zhou
This study investigates the risk-seeking behavior and optimal decisions of annuity providers. On the basis of a sample of U.S. life and annuity (L/A) insurers between 1997 and 2016, the results show clear performance-dependent risk attitudes. Specifically, insurers with returns below aspiration levels take more risks, whereas those with returns above reference levels decrease their risk taking, which supports the basic propositions of cumulative prospect theory (CPT). Given this initial evidence of mixed risk preferences in the L/A insurance industry, we derive an annuity insurer’s optimal investment and business strategies in a CPT decision-making framework. We show that changing risk preferences considerably affect an annuity provider’s decisions. We further illustrate how risk management changes an annuity insurer’s optimal strategies. Our results suggest that risk management lowers downside risk and allows a loss-averse decision maker to assume more risk and achieve a higher level of utility.
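The below-aspiration risk taking documented here is exactly what a prospect theory value function predicts. A minimal numerical sketch, using the standard Tversky–Kahneman parameter estimates and deliberately omitting the probability-weighting component of full CPT:

```python
def cpt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Tversky-Kahneman value function, reference point at 0:
    concave over gains, convex and loss-averse (factor lam) over losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

# A sure loss of 50 versus a 50/50 gamble: lose 100 or lose nothing.
sure = cpt_value(-50.0)
gamble = 0.5 * cpt_value(-100.0) + 0.5 * cpt_value(0.0)
# gamble > sure: the decision maker below the reference point prefers to gamble,
# i.e., exhibits risk-seeking in the loss domain.
```

The convexity of the value function over losses makes the fair gamble more attractive than the sure loss, mirroring the below-aspiration behavior the paper documents.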
Dynamic Fund Protection for Property Markets
Pub Date: 2021-08-17. DOI: 10.1080/10920277.2021.1948430
T. Siu, H. Nguyen, Ning Wang
This article aims to investigate, from an academic perspective, a potential application of dynamic fund protection to protect a mortgagor against the downside risk of falling property prices. The valuation of the dynamic fund protection is discussed through modeling the property price and the interest rate, two key factors with a material impact on the mortgagor. Specifically, a mean-reverting process is used to describe the property price, and the Heath-Jarrow-Morton theory is used to model the interest rate. The valuation is carried out using a forward measure approach. The numerical solution to the pricing partial differential equation is obtained by applying the finite difference method. Numerical results are provided, with model parameters estimated from data on an Australian residential property index and Australian zero-coupon yields and forward rates. The implications of the numerical results for the potential implementation of the dynamic fund protection are discussed.
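As a sketch of the first modeling ingredient only, the property price can be taken to be mean reverting in the log. The Euler scheme below (all parameters purely illustrative; the HJM rate model and the finite-difference PDE pricing are omitted) shows the mean-reversion mechanics:

```python
import numpy as np

def simulate_log_price(x0, mu, kappa, sigma, T, n_steps, n_paths, rng):
    """Euler scheme for dX_t = kappa * (mu - X_t) dt + sigma dW_t,
    where X = log property price; returns the terminal values X_T."""
    dt = T / n_steps
    x = np.full(n_paths, x0)
    for _ in range(n_steps):
        x = x + kappa * (mu - x) * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
    return x

rng = np.random.default_rng(3)
xT = simulate_log_price(x0=0.0, mu=1.0, kappa=2.0, sigma=0.2,
                        T=1.0, n_steps=100, n_paths=4000, rng=rng)
# the sample mean should sit near the exact conditional mean
# mu + (x0 - mu) * exp(-kappa * T) ~= 0.8647
```

The pull toward `mu` at speed `kappa` is what distinguishes this dynamic from a geometric Brownian motion and is the feature the paper exploits for property prices.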
Distributionally Robust Goal-Reaching Optimization in the Presence of Background Risk
Pub Date: 2021-08-10. DOI: 10.1080/10920277.2021.1966805
Yichun Chi, Z. Xu, S. Zhuang
In this article, we examine the effect of background risk on portfolio selection and optimal reinsurance design under the criterion of maximizing the probability of reaching a goal. Following the literature, we adopt dependence uncertainty to model the dependence ambiguity between financial risk (or insurable risk) and background risk. Because the goal-reaching objective function is nonconcave, these two problems pose highly unconventional and challenging issues for which classical optimization techniques often fail. Using a quantile formulation method, we derive the optimal solutions explicitly. The results show that the presence of background risk does not alter the shape of the solution but instead changes the parameter value of the solution. Finally, numerical examples are given to illustrate the results and verify the robustness of our solutions.
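The goal-reaching objective itself is easy to evaluate by Monte Carlo for a fixed (not optimal) constant-mix strategy in a Black–Scholes market; the quantile-formulation solution is derived in the paper, and all parameter values below are hypothetical:

```python
import numpy as np

def goal_prob(w0, goal, weight, mu, sigma, r, T, n_paths, rng):
    """P(W_T >= goal) for a constant-mix strategy holding fraction `weight`
    in a GBM stock (drift mu, vol sigma) and the rest at riskless rate r."""
    drift = r + weight * (mu - r) - 0.5 * (weight * sigma) ** 2
    z = rng.normal(size=n_paths)
    wT = w0 * np.exp(drift * T + weight * sigma * np.sqrt(T) * z)
    return float((wT >= goal).mean())

rng = np.random.default_rng(11)
# all riskless: the goal below w0*exp(rT) is reached with certainty
p_safe = goal_prob(1.0, 1.01, 0.0, 0.07, 0.2, 0.02, 1.0, 1000, rng)
# an ambitious goal needs risk taking, and is then reached only sometimes
p_risky = goal_prob(1.0, 1.20, 0.6, 0.07, 0.2, 0.02, 1.0, 20000, rng)
```

The example illustrates why the objective is nonconcave in the strategy: the probability jumps between 0 and 1 as the goal crosses the riskless terminal wealth, which is precisely what makes classical concave-optimization tools fail here.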
Alcohol and Mortality: An Actuarial Issue
Pub Date: 2021-08-06. DOI: 10.1080/10920277.2021.1946660
S. Gutterman
Despite some favorable global trends since 2010 in the prevalence of heavy episodic drinking, in alcohol-related mortality and morbidity, and in the prevalence of youth drinking in certain developed countries, there has been limited progress in reducing total per-capita alcohol consumption. The burden of disease attributable to alcohol remains high, particularly at pre-retirement ages, and is increasing in some countries and for some causes of death. This article describes the status and trends in alcohol consumption, both worldwide and in the United States. It also describes the adverse consequences of heavy and binge drinking, which are significant to the individual, to family and friends, and to society. Although the overall effect on mortality of moderate drinking compared with no drinking at all has generally been viewed as somewhat favorable, owing to the effect on certain cardiovascular risks, this view is not shared by all; the arguments involved are examined in this article. Recognition and active management of the adverse effects of heavy and binge alcohol consumption remain essential to favorable health and longevity. Possible public interventions are also described. Actuaries involved in assessing mortality trends and product design need to assess, on a regular basis, trends in the drivers and consequences of historical, current, and expected future alcohol-attributable mortality and morbidity patterns.
Tail Moments of Compound Distributions
Jiandong Ren
Pub Date: 2021-07-05. DOI: 10.2139/ssrn.3880127
In this article, we study the moment transform of both univariate and multivariate compound sums. We first derive simple explicit formulas for the first and second moment transforms when the (loss) frequency distribution is in the so-called (a, b, 0) class. Then we show that the derived formulas can be used to efficiently compute risk measures such as the tail conditional expectation (TCE), the tail variance (TV), and higher tail moments. The results generalize those in Denuit (North American Actuarial Journal, 24(4):512–32, 2020).
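For the Poisson member of the (a, b, 0) class, the aggregate loss distribution, and hence tail measures such as the TCE, can be computed with the classical Panjer recursion. The sketch below uses a toy discrete severity, not the paper's moment-transform formulas:

```python
import numpy as np

def panjer_poisson(lam, f, n):
    """Panjer recursion for compound Poisson S = X_1 + ... + X_N, N ~ Poisson(lam).
    f[k] = P(severity = k) on the integer grid, with f[0] = 0.
    Returns g[s] = P(S = s) for s = 0..n-1."""
    g = np.zeros(n)
    g[0] = np.exp(-lam)
    for s in range(1, n):
        ks = np.arange(1, min(s, len(f) - 1) + 1)
        g[s] = lam / s * np.sum(ks * f[ks] * g[s - ks])
    return g

def tce(g, q):
    """Tail conditional expectation E[S | S >= VaR_q] on the grid 0..len(g)-1."""
    cdf = np.cumsum(g)
    var_q = int(np.searchsorted(cdf, q))      # smallest s with P(S <= s) >= q
    s = np.arange(len(g))
    return float((s[var_q:] * g[var_q:]).sum() / g[var_q:].sum())

lam = 2.0
f = np.array([0.0, 0.6, 0.4])                 # severity: 1 w.p. 0.6, 2 w.p. 0.4
g = panjer_poisson(lam, f, 60)
mean = float((np.arange(60) * g).sum())       # equals lam * E[severity] = 2.8
tce95 = tce(g, 0.95)                          # strictly above the mean
```

The recursion is the (a, b, 0) special case with a = 0 and b = lam; the same loop handles the binomial and negative binomial members once a and b are set accordingly.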
Extreme Data Breach Losses: An Alternative Approach to Estimating Probable Maximum Loss for Data Breach Risk
Pub Date: 2021-06-30. DOI: 10.1080/10920277.2021.1919145
Kwangmin Jung
This study proposes a measure of the probable maximum loss for data breach risk, that is, the worst data breach loss likely to occur, using an alternative approach to estimating the potential loss from an extreme event with one of the largest private databases on data breach risk. We establish stationarity, detect an autoregressive feature, and identify the Fréchet-type generalized extreme value (GEV) distribution as the best fit for the series of data breach loss maxima, and we check the robustness of the model against a public dataset. We find that the predicted data breach loss likely to occur in the next five years is substantially larger than the loss estimated in the recent literature with a Pareto model. In particular, a comparison between the estimates from the recent data (after 2014) and those from the older data (before 2014) shows a significant increase, with a break in the loss severity. We design a three-layer reinsurance scheme, featuring a public–private partnership, based on the probable maximum loss estimates. Our findings are important for risk managers, actuaries, and policymakers concerned about the enormous cost of the next extreme cyber event.
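A block-maxima analysis of this kind can be sketched with `scipy.stats.genextreme`; note SciPy's shape convention c = -ξ, so the heavy-tailed Fréchet domain corresponds to c < 0. The simulated data below are a stand-in for the proprietary breach database:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
# stand-in for periodic maxima of (log) breach losses; c < 0 is Fréchet-type in SciPy
maxima = genextreme.rvs(c=-0.4, loc=10.0, scale=1.0, size=500, random_state=rng)

# fit a GEV to the maxima series by maximum likelihood
c_hat, loc_hat, scale_hat = genextreme.fit(maxima)

# a probable-maximum-loss style figure: a high return-level quantile of the fitted GEV
pml_99 = genextreme.ppf(0.99, c_hat, loc=loc_hat, scale=scale_hat)
```

A fitted shape parameter below zero confirms the Fréchet (heavy-tailed) regime, under which high return levels such as `pml_99` grow quickly with the quantile level, which is the qualitative driver of the paper's larger loss predictions relative to a Pareto fit.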
Using Model Averaging to Determine Suitable Risk Measure Estimates
Pub Date: 2021-06-17. DOI: 10.1080/10920277.2021.1911668
T. Miljkovic, B. Grün
Recent research in loss modeling has resulted in a growing number of classes of statistical models, as well as additional models proposed within each class. Empirical results indicate that a range of models within or between model classes perform similarly well, as measured by goodness of fit or information criteria, when fitted to the same data set. This leads to model uncertainty and makes model selection a challenging task. The problem is particularly acute if the resulting risk measures vary greatly between and within the model classes. We propose an approach to estimating risk measures that accounts for model selection uncertainty through model averaging. We exemplify the approach on the class of composite models, considering 196 left-truncated composite models previously used in the literature for loss modeling, and arrive at point estimates of the risk measures that take model uncertainty into account. A simulation study highlights the benefits of this approach. A data set of Norwegian fire losses is used to illustrate the proposed methodology.
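One simple instance of the idea, using Akaike weights over a small invented candidate set rather than the paper's 196 composite models (data and candidates are illustrative only):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# synthetic loss data, drawn here from a lognormal severity
losses = stats.lognorm.rvs(s=1.0, scale=np.exp(8.0), size=1000, random_state=rng)

candidates = {
    "lognorm": stats.lognorm,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}
aic, var995 = {}, {}
for name, dist in candidates.items():
    params = dist.fit(losses, floc=0)         # fix loc at 0: severities start at 0
    ll = dist.logpdf(losses, *params).sum()
    k = len(params) - 1                       # loc is fixed, not estimated
    aic[name] = 2 * k - 2 * ll
    var995[name] = dist.ppf(0.995, *params)   # per-model 99.5% VaR estimate

# Akaike weights: models close in AIC share the weight, poor fits get ~0
a = np.array(list(aic.values()))
w = np.exp(-0.5 * (a - a.min()))
w /= w.sum()
averaged_var = float(np.sum(w * np.array(list(var995.values()))))
```

The averaged VaR is a weighted compromise across candidates instead of a winner-takes-all pick, which is the point of accounting for model selection uncertainty.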
Discounting the Discounted Projection Approach
Pub Date: 2021-06-11. DOI: 10.1080/10920277.2021.1916537
D. Buckner, K. Dowd
U.K. equity release actuaries are using a flawed approach to value the no-negative equity guarantees in their equity release mortgages. The approach they use, the discounted projection approach, incorrectly uses projected future house prices as the underlying prices in their put option pricing equations. The correct approach uses forward house prices. The discounted projection approach entails significant undervaluations of no-negative equity guarantees and overvaluations of equity release mortgages and can produce valuations that violate rational pricing principles. The discounted projection approach is also inconsistent with both actuarial and accounting standards. Our results have significant ramifications for equity release industry practice and prudential regulation.
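The core point can be made with a Black (1976) put: feeding a projected (expected) future house price into the pricing formula in place of the forward price inflates the underlying and so understates the put, i.e., the no-negative equity guarantee. All parameter values below are illustrative only.

```python
from math import exp, log, sqrt

from scipy.stats import norm

def black_put(F, K, r, sigma, T):
    """Black (1976) price of a European put on a forward price F, strike K."""
    d1 = (log(F / K) + 0.5 * sigma ** 2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return exp(-r * T) * (K * norm.cdf(-d2) - F * norm.cdf(-d1))

# illustrative inputs: q is a net rental (deferment) yield, g a projected growth rate
S, K, r, q, g, sigma, T = 100.0, 100.0, 0.02, 0.03, 0.045, 0.12, 10.0

forward = S * exp((r - q) * T)       # correct underlying: the forward house price
projected = S * exp(g * T)           # flawed underlying: projected future house price

nneg_forward = black_put(forward, K, r, sigma, T)
nneg_projected = black_put(projected, K, r, sigma, T)
# nneg_forward > nneg_projected: the discounted projection approach undervalues the NNEG
```

Because a put price is decreasing in the underlying, any projected growth rate above the forward-implied drift mechanically shrinks the guarantee value, regardless of the other inputs.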
Reply to Jiandong Ren on Their Discussion on the Paper Titled “Size-Biased Risk Measures of Compound Sums”
Pub Date: 2021-06-03. DOI: 10.1080/10920277.2021.1925823
M. Denuit
I am grateful to Jiandong Ren for providing readers with a unified treatment of compound sums with a frequency component in the (a, b, 0) class of counting distributions, which is central to insurance studies. This offers a deeper understanding of the underlying structure of this family, compared to the separate treatment of the Poisson and negative binomial cases in the paper (the latter being treated as a Poisson mixture). I therefore sincerely thank Jiandong Ren for having supplemented the initial work with these brilliant ideas. As stressed at the end of the discussion, the Panjer algorithm is particularly useful for computing tail risk measures. In addition to exact calculations, the approximations derived by Denuit and Robert (2021) in terms of polynomial expansions (with respect to the Gamma distribution and its associated Laguerre orthonormal polynomials, or with respect to the Normal distribution and its associated Hermite polynomials when the size of the pool gets larger) may also be useful in the present context. Depending on the thickness of the tails of the loss distributions, the latter may be replaced with their Esscher transforms (or exponential tiltings) of negative order.
Compound sums with an (a, b, 0) frequency component are also considered as an application in that paper, and the proposed method is compared with the well-established Panjer recursive algorithm.