ASTIN Bulletin
http://poj.peeters-leuven.be/content.php?url=journal&journal_code=AST
Recent articles

Development Pattern and Prediction Error for the Stochastic Bornhuetter-Ferguson Claims Reserving Method
poj@peeters-leuven.be
http://dx.doi.org/10.2143/AST.41.2.2136979
http://poj.peeters-leuven.be/content.php?url=article&id=2136979
Thu, 22 Dec 2011 10:54:30 GMT
We investigate how the development pattern in the Bornhuetter-Ferguson method should be estimated and derive the corresponding conditional mean square error of prediction (MSEP) of the ultimate claim prediction. An estimator of this conditional MSEP in a distribution-free model was given by Mack [9], whereas Alai et al. [2] studied this conditional MSEP in an over-dispersed Poisson model using the chain ladder development pattern. First we consider distributional models and derive maximum likelihood estimators for the development pattern, taking all relevant information into account. Moreover, we suggest new estimators of the correlation matrix of these estimators and new estimators of the conditional MSEP. Our findings supplement some of Mack’s results. The methodology is illustrated with two numerical examples.
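As a minimal sketch of the method under discussion (not the paper's estimators): the Bornhuetter-Ferguson predictor combines the claims reported to date with the still-unreported share of a prior expected ultimate, where that share comes from an estimated development pattern. All names and figures below are illustrative.

```python
import numpy as np

def bf_ultimate(latest_cumulative, dev_year, beta, nu):
    """Bornhuetter-Ferguson prediction: claims reported to date plus the
    still-unreported fraction (1 - beta[dev_year]) of the prior ultimate nu."""
    return latest_cumulative + (1.0 - beta[dev_year]) * nu

# Illustrative cumulative development pattern: beta[j] is the expected
# fraction of the ultimate claim reported by development year j.
beta = np.array([0.5, 0.75, 0.875, 1.0])

print(bf_ultimate(latest_cumulative=900.0, dev_year=1, beta=beta, nu=1000.0))
# 900 + (1 - 0.75) * 1000 = 1150.0
```

The paper's contribution concerns how beta itself should be estimated and how the estimation error of beta feeds into the MSEP; the predictor above only shows where the pattern enters.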
Market-Consistent Valuation of Insurance Liabilities by Cost of Capital
http://dx.doi.org/10.2143/AST.41.2.2136980
http://poj.peeters-leuven.be/content.php?url=article&id=2136980
Thu, 22 Dec 2011 11:48:11 GMT
This paper investigates market-consistent valuation of insurance liabilities, primarily in the context of Solvency II and, to some extent, IFRS 4. We propose an explicit and consistent framework for the valuation of insurance liabilities which incorporates the Solvency II approach as a special case. The proposed framework is based on replication over multiple (one-year) time periods by a periodically updated portfolio of assets with reliable market prices, allowing for 'limited liability' in the sense that the replication cannot always be continued. The asset portfolio consists of two parts: (1) assets whose market price defines the value of the insurance liabilities, and (2) capital funds used to cover risk which cannot be replicated. The capital funds give rise to capital costs; the main exogenous input in the framework is the condition on when the investment of the capital funds is acceptable. We investigate the existence of the value and show that its exact calculation has to be done recursively, backwards in time, starting at the end of the lifetime of the insurance liabilities. We derive upper bounds on the value and, for the special case of replication by risk-free one-year zero-coupon bonds, explicit recursive formulas for calculating it. We only partially consider the question of the uniqueness of the value. Valuation in Solvency II and IFRS 4 is based on representing the value as the sum of a 'best estimate' and a 'risk margin'. In our framework, it turns out that this split is not natural. Nonetheless, we show that a split can be constructed as a simplification, and that under suitable conditions it provides an upper bound on the value. We illustrate the general results by explicitly calculating the value for a simple example.
The Impact of Genetic Information on the Insurance Industry
http://dx.doi.org/10.2143/AST.41.2.2136981
http://poj.peeters-leuven.be/content.php?url=article&id=2136981
Thu, 22 Dec 2011 11:53:39 GMT
We quantify the overall impact of genetic information on the insurance industry using the 'bottom-up' approach, in which detailed models are constructed of representative major genetic disorders. We consider six such disorders, namely adult polycystic kidney disease, early-onset Alzheimer's disease, Huntington's disease, myotonic dystrophy (MD), hereditary non-polyposis colorectal cancer, and breast/ovarian cancer. Actuarial models based on the epidemiological literature exist for all of these except MD. We parameterise a suitable model of MD, then synthesize the results from all six models to estimate the adverse selection costs arising from restrictions on insurers' use of genetic information. These are all very small, only in the most extreme cases rising above 1% of premiums. In the worst case - females displaying 'extreme' adverse selection in a 'small' critical illness insurance market, with the use of family history banned - the cost is about 3% of premiums. Our model includes the most common single-gene disorders relevant to insurance, and includes representatives of most important classes of these disorders. While the 'bottom-up' approach could be continued by modelling more and more diseases, we suggest that our model is adequate to draw robust conclusions.
Modelling Adult Mortality in Small Populations
http://dx.doi.org/10.2143/AST.41.2.2136982
http://poj.peeters-leuven.be/content.php?url=article&id=2136982
Thu, 22 Dec 2011 11:55:49 GMT
The mortality evolution of small populations often exhibits substantial variability and irregular improvement patterns making it hard to identify underlying trends and produce plausible projections. We propose a methodology for robust forecasting based on the existence of a larger reference population sharing the same long-term trend as the population of interest. The reference population is used to estimate the parameters in a frailty model for the underlying intensity surface. A multivariate time series model describing the deviations of the small population mortality from the underlying mortality is then fitted and forecasted. Coherent long-term forecasts are ensured by the underlying frailty model while the size and variability of short- to medium-term deviations are quantified by the time series model. The frailty model is particularly well suited to describe the changing improvement patterns in old age mortality. We apply the method to Danish mortality data with a pooled international data set as reference population.
Modelling and Forecasting the Mortality of the Very Old
http://dx.doi.org/10.2143/AST.41.2.2136983
http://poj.peeters-leuven.be/content.php?url=article&id=2136983
Thu, 22 Dec 2011 11:57:13 GMT
The forecasting of the future mortality of the very old presents additional challenges since data quality can be poor at such ages. We consider a two-factor model for stochastic mortality, proposed by Cairns, Blake and Dowd, which is particularly well suited to forecasting at very high ages. We consider an extension to their model which improves fit and also allows forecasting at these high ages. We illustrate our methods with data from the Continuous Mortality Investigation.
Fair Valuation of Life Insurance Contracts under a Correlated Jump Diffusion Model
http://dx.doi.org/10.2143/AST.41.2.2136984
http://poj.peeters-leuven.be/content.php?url=article&id=2136984
Thu, 22 Dec 2011 11:59:14 GMT
In this paper, we study the fair valuation of participating life insurance contracts, one of the most common life insurance products, under a jump diffusion model that takes default risk into account. The participating life insurance contracts considered here can be expressed as portfolios of options, as shown by Grosen and Jørgensen (1997). We use Laplace transform methods to price these options.
Dependent Loss Reserving Using Copulas
http://dx.doi.org/10.2143/AST.41.2.2136985
http://poj.peeters-leuven.be/content.php?url=article&id=2136985
Thu, 22 Dec 2011 12:02:34 GMT
Modeling dependencies among multiple loss triangles has important implications for the determination of loss reserves, a critical element of risk management and capital allocation practices of property-casualty insurers. In this article, we propose a copula regression model for dependent lines of business that can be used to predict unpaid losses and hence determine loss reserves. The proposed method, relating the payments in different run-off triangles through a copula function, allows the analyst to use flexible parametric families for the loss distribution and to understand the associations among lines of business. Based on the copula model, a parametric bootstrap procedure is developed to incorporate the uncertainty in parameter estimates. To illustrate this method, we consider an insurance portfolio consisting of personal and commercial automobile lines. When applied to the data of a major US property-casualty insurer, our method provides point predictions of unpaid losses comparable with the industry's standard practice, chain-ladder estimates. Moreover, our flexible structure allows us to easily compute the entire predictive distribution of unpaid losses. This procedure also readily yields accident year reserves, calendar year reserves, as well as aggregate reserves. One important implication of the dependence modeling is that it allows analysts to quantify the diversification effects in risk capital analysis. We demonstrate these effects by calculating commonly used risk measures, including value at risk and conditional tail expectation, for the insurer's combined portfolio of personal and commercial automobile lines.
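Since the paper benchmarks against chain-ladder estimates, a minimal chain-ladder sketch on a toy cumulative run-off triangle may help fix ideas (illustrative numbers and volume-weighted development factors; this is the industry benchmark, not the authors' copula model):

```python
import numpy as np

# Toy cumulative run-off triangle: rows = accident years,
# columns = development years, np.nan = not yet observed.
tri = np.array([
    [100.0, 150.0, 165.0],
    [110.0, 170.0, np.nan],
    [120.0, np.nan, np.nan],
])

def chain_ladder(tri):
    """Fill the lower triangle using volume-weighted development factors."""
    out = tri.copy()
    for j in range(tri.shape[1] - 1):
        obs = ~np.isnan(tri[:, j + 1])                  # observed pairs
        f = tri[obs, j + 1].sum() / tri[obs, j].sum()   # factor j -> j+1
        fill = np.isnan(out[:, j + 1])
        out[fill, j + 1] = out[fill, j] * f
    return out

full = chain_ladder(tri)
reserves = full[:, -1] - np.array([165.0, 170.0, 120.0])  # ultimate - latest paid
```

The copula regression approach of the paper replaces the deterministic fill above with a joint stochastic model across several such triangles, so that a full predictive distribution (not just point reserves) is available.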
Optimal Reinsurance under VaR and CVaR Risk Measures
http://dx.doi.org/10.2143/AST.41.2.2136986
http://poj.peeters-leuven.be/content.php?url=article&id=2136986
Thu, 22 Dec 2011 12:04:51 GMT
In this paper, we study two classes of optimal reinsurance models by minimizing the total risk exposure of an insurer under the criteria of value at risk (VaR) and conditional value at risk (CVaR). We assume that the reinsurance premium is calculated according to the expected value principle. Explicit solutions for the optimal reinsurance policies are derived over ceded loss functions with increasing degrees of generality. More precisely, we establish formally that under the VaR minimization model, (i) stop-loss reinsurance is optimal among the class of increasing convex ceded loss functions; (ii) when the constraints on both ceded and retained loss functions are relaxed to increasing functions, stop-loss reinsurance with an upper limit is optimal; and (iii) over the set of general increasing and left-continuous retained loss functions, truncated stop-loss reinsurance is optimal. In contrast, under the CVaR risk measure, stop-loss reinsurance is always optimal. These results suggest that the VaR-based reinsurance models are sensitive to the constraints imposed on both ceded and retained loss functions, while the corresponding CVaR-based models are quite robust.
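The objective being minimized can be sketched numerically (Monte Carlo with exponential losses; the paper derives its solutions analytically, and the parameters here are purely illustrative): total risk exposure is the VaR of the retained loss plus the expected-value-principle premium for the ceded part.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=100.0, size=200_000)   # illustrative loss sample

def total_exposure_var(x, d, theta=0.2, alpha=0.95):
    """VaR-based total exposure for a stop-loss treaty with retention d:
    VaR_alpha(min(X, d)) + (1 + theta) * E[(X - d)_+]."""
    premium = (1.0 + theta) * np.mean(np.maximum(x - d, 0.0))
    retained = np.minimum(x, d)
    return np.quantile(retained, alpha) + premium

# Scan retentions to locate the empirical optimum for this sample.
grid = np.linspace(50.0, 600.0, 56)
best_d = grid[np.argmin([total_exposure_var(x, d) for d in grid])]
```

This only illustrates the trade-off (a lower retention cuts the retained VaR but raises the premium); the paper's point is what the optimal ceded loss function looks like when the stop-loss form itself is not imposed in advance.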
The Impact of Stochastic Volatility on Pricing, Hedging, and Hedge Efficiency of Withdrawal Benefit Guarantees in Variable Annuities
http://dx.doi.org/10.2143/AST.41.2.2136987
http://poj.peeters-leuven.be/content.php?url=article&id=2136987
Thu, 22 Dec 2011 12:07:12 GMT
We analyze different types of guaranteed withdrawal benefits for life, the latest guarantee feature within variable annuities. Besides an analysis of the impact of different product features on the clients' payoff profile, we focus on pricing and hedging of the guarantees. In particular, we investigate the impact of stochastic equity volatility on pricing and hedging. We consider different dynamic hedging strategies for delta and vega risks and compare their performance. We also examine the effects when the hedging model (with deterministic volatility) differs from the data-generating model (with stochastic volatility). This indicates the model risk an insurer takes by assuming constant equity volatilities for risk-management purposes when real-world volatilities are stochastic.
Optimal Reinsurance Revisited
http://dx.doi.org/10.2143/AST.41.2.2136988
http://poj.peeters-leuven.be/content.php?url=article&id=2136988
Thu, 22 Dec 2011 12:10:23 GMT
It is known that the partial stop-loss contract is an optimal reinsurance form under the VaR risk measure. Assuming that market premiums are set according to the expected value principle with varying loading factors, the optimal reinsurance parameters of this contract are obtained under three alternative single- and joint-party reinsurance criteria: (i) strong minimum of the total retained loss VaR measure; (ii) weak minimum of the total retained loss VaR measure and maximum of the reinsurer’s expected profit; (iii) weak minimum of the total retained loss VaR measure and minimum of the total variance risk measure. New conditions under which the cedent’s and the reinsurer’s required VaR economic capital can simultaneously be financed in the mean are revealed, for situations of pure risk transfer (classical reinsurance) and of risk and profit transfer (design of internal reinsurance, or a reinsurance captive owned by a corporate firm).
Modelling Dependence in Insurance Claims Process with Lévy Copulas
http://dx.doi.org/10.2143/AST.41.2.2136989
http://poj.peeters-leuven.be/content.php?url=article&id=2136989
Thu, 22 Dec 2011 12:12:31 GMT
In this paper we investigate the potential of Lévy copulas as a tool for modelling dependence between compound Poisson processes and their applications in insurance. We analyse characteristics regarding the dependence in frequency and dependence in severity allowed by various Lévy copula models. Through the introduction of new Lévy copulas and comparison with the Clayton Lévy copula, we show that Lévy copulas allow for a great range of dependence structures. Procedures for analysing the fit of Lévy copula models are illustrated by fitting a number of Lévy copulas to a set of real data from Swiss workers' compensation insurance. How to assess the fit of these models with respect to the dependence structure exhibited by the dataset is also discussed. Finally, we provide a decomposition of the trivariate compound Poisson process and discuss how trivariate Lévy copulas model dependence in this multivariate setting.
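One concrete consequence of this construction (a known property of Lévy copulas for compound Poisson processes, sketched here with illustrative intensities): for a bivariate compound Poisson process with marginal jump intensities lam1, lam2 and Lévy copula C, the intensity of common jumps is C(lam1, lam2). For the Clayton Lévy copula:

```python
def clayton_levy(u, v, theta):
    """Clayton Lévy copula: C(u, v) = (u^-theta + v^-theta)^(-1/theta)."""
    return (u ** -theta + v ** -theta) ** (-1.0 / theta)

lam1, lam2 = 2.0, 3.0            # illustrative marginal jump intensities
theta = 1.0                      # dependence parameter
lam_common = clayton_levy(lam1, lam2, theta)   # intensity of common jumps
lam1_only = lam1 - lam_common    # jumps occurring in process 1 alone
lam2_only = lam2 - lam_common    # jumps occurring in process 2 alone
```

As theta grows, C(u, v) approaches min(u, v), i.e. complete dependence in frequency, while theta -> 0 gives independent jump arrivals; this is the "dependence in frequency" axis the abstract refers to.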
Optimal Dividends and Capital Injections in the Dual Model with Diffusion
http://dx.doi.org/10.2143/AST.41.2.2136990
http://poj.peeters-leuven.be/content.php?url=article&id=2136990
Thu, 22 Dec 2011 12:14:43 GMT
The dual model with diffusion is appropriate for companies with continuous expenses that are offset by stochastic and irregular gains. Examples include research-based or commission-based companies. In this context, Avanzi and Gerber (2008) showed how to determine the expected present value of dividends, if a barrier strategy is followed. In this paper, we further include capital injections and allow for (proportional) transaction costs both on dividends and capital injections. We determine the optimal dividend and (unconstrained) capital injection strategy (among all possible strategies) when jumps are hyperexponential. This strategy happens to be either a dividend barrier strategy without capital injections, or another dividend barrier strategy with forced injections when the surplus is null to prevent ruin. The latter is also shown to be the optimal dividend and capital injection strategy, if ruin is not allowed to occur. Both the choice to inject capital or not and the level of the optimal barrier depend on the parameters of the model. In all cases, we determine the optimal dividend barrier and show its existence and uniqueness. We also provide closed form representations of the value functions when the optimal strategy is applied. Results are illustrated.
Randomized Observation Periods for the Compound Poisson Risk Model: Dividends
http://dx.doi.org/10.2143/AST.41.2.2136991
http://poj.peeters-leuven.be/content.php?url=article&id=2136991
Thu, 22 Dec 2011 12:16:18 GMT
In the framework of the classical compound Poisson process in collective risk theory, we study a modification of the horizontal dividend barrier strategy by introducing random observation times at which dividends can be paid and ruin can be observed. This model contains both the continuous-time and the discrete-time risk model as a limit and represents a certain type of bridge between them which still enables the explicit calculation of moments of total discounted dividend payments until ruin. Numerical illustrations for several sets of parameters are given and the effect of random observation times on the performance of the dividend strategy is studied.
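The mechanism can be sketched by simulation (illustrative parameters and exponential claim sizes; the paper's results are explicit formulas, not Monte Carlo): dividends are the excess over a barrier b, but paid only at randomly spaced observation times, and ruin counts only if the surplus is negative at an observation.

```python
import numpy as np

def simulate_pv_dividends(u=10.0, c=1.5, lam=1.0, b=15.0, gamma=2.0,
                          delta=0.03, horizon=200.0, seed=0):
    """One path of discounted dividends until observed ruin or the horizon."""
    rng = np.random.default_rng(seed)
    t, surplus, pv = 0.0, u, 0.0
    next_claim = rng.exponential(1.0 / lam)    # compound Poisson claim times
    next_obs = rng.exponential(1.0 / gamma)    # random observation times
    while t < horizon:
        if next_claim < next_obs:              # a claim occurs first
            surplus += c * (next_claim - t) - rng.exponential(1.0)
            t = next_claim
            next_claim = t + rng.exponential(1.0 / lam)
        else:                                  # an observation occurs first
            surplus += c * (next_obs - t)
            t = next_obs
            if surplus < 0.0:                  # ruin is only seen here
                break
            if surplus > b:                    # dividend: excess over barrier
                pv += np.exp(-delta * t) * (surplus - b)
                surplus = b
            next_obs = t + rng.exponential(1.0 / gamma)
    return pv

mean_pv = np.mean([simulate_pv_dividends(seed=s) for s in range(100)])
```

Letting the observation rate gamma grow recovers the continuous-time barrier model, while a deterministic observation grid would give the discrete-time model; the exponential spacing is what keeps the model analytically tractable in the paper.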
Using the Censored Gamma Distribution for Modelling Fractional Response Variables with an Application to Loss Given Default
http://dx.doi.org/10.2143/AST.41.2.2136992
http://poj.peeters-leuven.be/content.php?url=article&id=2136992
Thu, 22 Dec 2011 12:18:01 GMT
Regression models for limited continuous dependent variables having a non-negligible probability of attaining exactly their limits are presented. The models differ in the number of parameters and in their flexibility. Fractional data being a special case of limited dependent data, the models also apply to variables that are a fraction or a proportion. It is shown how to fit these models and they are applied to a Loss Given Default dataset from insurance to which they provide a good fit.
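A hedged sketch of one such censored model (a simple variant, not necessarily the paper's exact parameterisation): a latent gamma variable shifted left by xi is censored below at zero, which produces the point mass at the lower limit, and the log-likelihood mixes the censoring probability with the interior density.

```python
import numpy as np
from math import lgamma, log, exp

def gamma_logpdf(x, a, s):
    """Log density of Gamma(shape a, scale s)."""
    return (a - 1.0) * log(x) - x / s - a * log(s) - lgamma(a)

def gamma_cdf(x, a, s, n=20_000):
    """P(G <= x) by trapezoidal integration of the density (sketch-grade)."""
    t = np.linspace(1e-12, x, n)
    f = np.exp((a - 1.0) * np.log(t) - t / s - a * np.log(s) - lgamma(a))
    h = t[1] - t[0]
    return h * (f.sum() - 0.5 * (f[0] + f[-1]))

def censored_gamma_loglik(y, a, s, xi):
    """Latent G ~ Gamma(a, s); observed Y = max(G - xi, 0)."""
    ll = 0.0
    for yi in y:
        if yi == 0.0:
            ll += log(gamma_cdf(xi, a, s))     # point mass at the limit
        else:
            ll += gamma_logpdf(yi + xi, a, s)  # interior observation
    return ll
```

Maximising this log-likelihood over (a, s, xi), with covariates entering through the parameters, is the general fitting recipe the abstract alludes to; a model censored at both 0 and 1 handles fractions with mass at both limits.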