American Journal of Applied Mathematics
Volume 4, Issue 2, April 2016, Pages: 105-109

EVT and Its Application to Pricing of Catastrophe (Typhoon) Reinsurance

Hu Yue, Zhang Li, Shen Li-jie, Gao Li-ya

Department of Mathematics and Information Science, Zhejiang University of Science and Technology, Hangzhou, P.R. China

Corresponding author: Hu Yue

Hu Yue, Zhang Li, Shen Li-jie, Gao Li-ya. EVT and Its Application to Pricing of Catastrophe (Typhoon) Reinsurance. American Journal of Applied Mathematics. Vol. 4, No. 2, 2016, pp. 105-109. doi: 10.11648/j.ajam.20160402.16

Received: March 15, 2016; Accepted: April 9, 2016; Published: April 11, 2016

Abstract: Extreme value theory is a branch of the theory of order statistics that provides a disciplined statistical approach to extreme events. Catastrophe losses often contain extreme values, and describing the loss amounts with traditional statistical methods would ignore these extreme data. In this paper, we consider the premium of excess-of-loss reinsurance policies with different attachment points, based on the idea of layered pricing with an extreme value model. In the empirical analysis, we fit a POT model to typhoon loss data from Zhejiang Province to determine the pure typhoon premium.

Keywords: Extreme Value Theory, POT, Reinsurance, Pure Premium

1. Introduction

The insurance industry is facing more and more catastrophe losses, such as Hurricane Andrew in the United States in 1992, the great flood in China in 1998, the water seepage of Shanghai subway Line 4, the Indian Ocean tsunami in 2004, Hurricane Katrina in the United States in 2005, and the snow disaster in South China at the beginning of 2008. These events brought tens or even hundreds of billions of dollars in losses and had a huge impact on the stability of the insurance industry. Therefore, the management of extreme events has become one of the hot topics in insurance risk management [1].

The probability of extreme events is very low, but the damage is huge. Extreme events in insurance are events whose probability of occurrence is very small but which cause serious, even devastating, damage to the insurance industry. Once such an event occurs, a single insurance company is generally unable to bear it, or is seriously affected by it; for such companies the results are disastrous.

As Philippe, J.B. (2000) noted, since the central limit theorem applies only in the central region of a distribution, we cannot be sure that Gaussian assumptions hold for extreme events. It is these extreme risks that people are most concerned about and most want to control first. Therefore, it would be unwise to discard the influence of these extreme events.

2. The Principle of Extreme Value Theory

Extreme value theory was first proposed by Fisher and Tippett [2] in the 1920s. Further analysis was carried out by Gnedenko in the 1940s, and the probability models of extreme values were standardized by Gumbel [3] in the 1950s.

Gumbel, Fréchet, Weibull et al. put forward three types of extreme value distribution [4]: if there exist constant sequences a_n > 0 and b_n such that P((M_n − b_n)/a_n ≤ x) → G(x) as n → ∞ for some non-degenerate distribution G, where M_n is the maximum of n i.i.d. observations, then G belongs to one of the following three types of distribution functions:

I (Gumbel): Λ(x) = exp(−e^(−x)), −∞ < x < ∞;

II (Fréchet): Φ_α(x) = exp(−x^(−α)), x > 0, α > 0;

III (Weibull): Ψ_α(x) = exp(−(−x)^α), x ≤ 0, α > 0.

If we introduce a location parameter μ and a scale parameter σ > 0, the three distributions can be expressed in a unified form:

G(x) = exp{−[1 + ξ(x − μ)/σ]^(−1/ξ)}, 1 + ξ(x − μ)/σ > 0.          (1)

G is called the generalized extreme value distribution, abbreviated as the GEV distribution, and ξ is a shape parameter: ξ > 0 corresponds to the Fréchet type, ξ = 0 (interpreted as the limit) to the Gumbel type, and ξ < 0 to the Weibull type.

The univariate extreme value model [5] can be divided into two categories according to the method of obtaining extreme samples: the BMM (Block Maxima Method) model and the POT (Peaks Over Threshold) model. The BMM model first divides the data into several blocks, then takes the maximum (minimum) of each block as the sample to fit [6] the generalized extreme value (GEV) distribution. The POT model first sets a threshold, then takes the observations exceeding the threshold as the sample to fit the generalized Pareto distribution (GPD).
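The two sampling schemes can be sketched as follows (a minimal illustration with synthetic losses; the function names and figures are ours, not the paper's):

```python
import random

def block_maxima(losses, block_size):
    """BMM sample: split the loss series into blocks, keep each block's maximum
    (these maxima are then fitted with a GEV distribution)."""
    return [max(losses[i:i + block_size])
            for i in range(0, len(losses) - block_size + 1, block_size)]

def pot_exceedances(losses, threshold):
    """POT sample: excesses over the threshold (fitted with a GPD)."""
    return [x - threshold for x in losses if x > threshold]

# Synthetic heavy-tailed losses purely for illustration
random.seed(1)
losses = [random.paretovariate(1.5) * 10 for _ in range(64)]
maxima = block_maxima(losses, block_size=8)       # 8 block maxima for a GEV fit
excesses = pot_exceedances(losses, threshold=60)  # GPD sample
```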

The modeling data of the BMM model come from the maximum of each block, but because statistical observations cluster, a series of huge losses often occurs within a short period of time, and these claims are larger than the maxima of the other blocks. To make full use of these large values, the POT model studies the asymptotic GPD distribution of the excesses over a certain threshold u.

Assume the random variable X has distribution function F(x) with right endpoint x_F. For a threshold u < x_F, we call

F_u(y) = P(X − u ≤ y | X > u) = (F(u + y) − F(u))/(1 − F(u)), 0 ≤ y < x_F − u,

the excess distribution function of X over u. When u is large enough, F_u(y) can be approximated by the GPD; u is then called the threshold. The GPD was first proposed by Pickands in 1975; its distribution function is:

G_(ξ,σ)(y) = 1 − (1 + ξy/σ)^(−1/ξ), ξ ≠ 0;  G_(0,σ)(y) = 1 − e^(−y/σ), ξ = 0,          (2)

where y ≥ 0 (and 0 ≤ y ≤ −σ/ξ when ξ < 0).

The selection of the threshold is very important: an unreasonable threshold may cause significant deviation in the fitted model. If the chosen threshold is too large, there are too few excesses, so the variance of the parameter estimates becomes large; on the contrary, if it is too small, the parameter estimates are biased. Two methods are usually used to determine the threshold. One is the Hill plot [7], defined as the set of points {(k, H_(k,n)^(−1)) : 1 ≤ k < n}, where, with X_(1) ≥ X_(2) ≥ … ≥ X_(n) the descending order statistics,

H_(k,n) = (1/k) Σ_(i=1)^(k) ln X_(i) − ln X_(k+1).          (3)
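The Hill estimator above can be computed directly from the ordered sample; the following is a minimal sketch (our own helper names):

```python
import math

def hill_estimator(losses, k):
    """Hill tail-index estimator H_{k,n} built from the k largest observations."""
    x = sorted(losses, reverse=True)  # X_(1) >= X_(2) >= ... >= X_(n)
    return sum(math.log(x[i]) for i in range(k)) / k - math.log(x[k])

def hill_plot_points(losses, k_max):
    """Points (k, 1/H_{k,n}) of the Hill plot; the threshold is read off where
    the plotted tail index stabilises."""
    return [(k, 1.0 / hill_estimator(losses, k)) for k in range(1, k_max + 1)]
```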

We select as the threshold u the order statistic corresponding to the starting point K of the region where the tail index is stable. The other method is the mean excess function (MEF), defined as

e(u) = E[X − u | X > u].          (4)

We draw the scatter diagram of (u, e_n(u)) and select a sufficiently large u_0 such that e_n(u) is approximately linear for u ≥ u_0. If the MEF above the threshold u_0 has a positive slope, the data follow a generalized Pareto distribution with positive shape parameter; the MEF is horizontal when the data come from an exponential distribution, and has a negative slope when the data come from a short-tailed distribution.
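The empirical mean excess function is simple to compute; a sketch (helper names are ours):

```python
def mean_excess(losses, u):
    """Empirical mean excess e_n(u): average of (x - u) over observations x > u."""
    exc = [x - u for x in losses if x > u]
    return sum(exc) / len(exc) if exc else None

def mef_points(losses, grid):
    """Scatter (u, e_n(u)) for candidate thresholds u; an approximately linear
    stretch with positive slope suggests a GPD tail with positive shape."""
    pts = [(u, mean_excess(losses, u)) for u in grid]
    return [(u, e) for u, e in pts if e is not None]
```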

Because of the limited sample size, an appropriate estimation method must be chosen. After the threshold is confirmed, we can use the maximum likelihood method or the method of moments to estimate the GPD parameters.
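A maximum likelihood fit of the GPD minimises the negative log-likelihood over (ξ, σ). The following is a deliberately crude sketch using a grid search (in practice one would use a numerical optimiser such as `scipy.optimize.minimize`; the grid bounds are our assumption):

```python
import math

def gpd_nll(excesses, xi, sigma):
    """Negative log-likelihood of the GPD for excesses over the threshold."""
    if sigma <= 0:
        return float("inf")
    nll = len(excesses) * math.log(sigma)
    for y in excesses:
        if xi != 0:
            z = 1.0 + xi * y / sigma
            if z <= 0:
                return float("inf")  # outside the GPD support
            nll += (1.0 + 1.0 / xi) * math.log(z)
        else:
            nll += y / sigma  # exponential (xi = 0) case
    return nll

def gpd_fit(excesses):
    """Crude maximum-likelihood fit by grid search over (xi, sigma)."""
    mean = sum(excesses) / len(excesses)
    xi_grid = [i / 100.0 for i in range(-50, 96)]            # -0.50 .. 0.95
    sigma_grid = [mean * (0.2 + 0.05 * j) for j in range(60)]
    return min(((xi, s) for xi in xi_grid for s in sigma_grid),
               key=lambda p: gpd_nll(excesses, p[0], p[1]))
```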

3. Catastrophe Reinsurance Pricing

In the analysis of insured losses, historical catastrophe data are usually fitted with a heavy-tailed distribution [8], such as the Pareto or lognormal distribution. These distributions may not reflect the large losses of catastrophe excess-of-loss reinsurance well and are not suitable for pricing the top layers of reinsurance [9], but the extreme value distribution can compensate for this defect.

We assume each loss X_i is independent and identically distributed with a non-degenerate distribution F, and that N, the random number of losses within a year, follows a Poisson distribution [10]; that is, we adopt the compound Poisson model. If the reinsurance contract sets the attachment point at D, the reinsurer's total compensation is:

S = Σ_(i=1)^(N) (X_i − D)_+,  where (x)_+ = max(x, 0).          (5)

We take P = E[S] as the annual net premium of the reinsurance, with N the random number of losses within a year. The reinsurance compensation process is also a compound Poisson process. Under the compound Poisson model with intensity λ = E[N], the reinsurance net premium is:

P = E[S] = λ E[(X − D)_+].          (6)

Thus the POT model of extreme value theory can be applied to the calculation of the reinsurance net premium. Catastrophe reinsurance contracts often set a high attachment point. When the attachment point D is higher than or equal to the threshold u set in the POT model, the excess distribution function can be approximated by the GPD. In practice, however, the attachment point D may differ considerably from the threshold. Accordingly, there are three cases for calculating the pure premium:

When D is equal to u, it can be considered that:

F_u(y) ≈ G_(ξ,σ)(y) = 1 − (1 + ξy/σ)^(−1/ξ).          (7)

At this time the expected payment per claim exceeding u is the GPD mean:

E[X − u | X > u] = σ/(1 − ξ), ξ < 1.          (8)

The unknown parameters can be replaced by their estimates. Let λ_u be the annual frequency of claims exceeding u; the estimate of the pure premium is:

P̂ = λ̂_u σ̂/(1 − ξ̂).          (9)

The disadvantage of this model is that the premium cannot be estimated when ξ ≥ 1, because the mean of the GPD does not exist; such a risk is generally considered uninsurable.
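Formula (9) is a one-liner; the sketch below uses hypothetical figures (not the paper's estimates):

```python
def pure_premium_at_threshold(lam_u, sigma, xi):
    """Pure premium when the attachment point equals the threshold (formula (9)):
    annual exceedance frequency times the GPD mean excess sigma/(1 - xi)."""
    if xi >= 1:
        raise ValueError("GPD mean does not exist for xi >= 1; risk uninsurable")
    return lam_u * sigma / (1.0 - xi)

# Hypothetical figures: 0.8 exceedances per year, scale 120, shape 0.4
premium = pure_premium_at_threshold(0.8, 120.0, 0.4)  # 0.8 * 120 / 0.6 = 160.0
```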

When D is larger than u, the above model needs to be adjusted. One way is to make D the new threshold: in theory, any value that passes the POT model tests can be used as a threshold. But observed catastrophe loss data are very few, and this truncation would lose a lot of effective information, so the resulting model is not accurate. We therefore take another adjustment. By the properties of the Poisson distribution, the number of claims beyond the attachment point D still follows a Poisson distribution, and the compound Poisson assumptions above still apply; the Poisson parameter is adjusted as follows:

λ_D = λ_u P(X > D | X > u) = λ_u [1 + ξ(D − u)/σ]^(−1/ξ).          (10)

The excess distribution function over D is:

F_D(y) ≈ 1 − [1 + ξy/(σ + ξ(D − u))]^(−1/ξ),          (11)

i.e. a GPD with the same shape parameter ξ and scale σ + ξ(D − u). The expected payment per claim exceeding D is:

E[X − D | X > D] = (σ + ξ(D − u))/(1 − ξ), ξ < 1.          (12)

Substituting estimates for the unknown parameters, the pure premium in this case is:

P̂ = λ̂_u [1 + ξ̂(D − u)/σ̂]^(−1/ξ̂) (σ̂ + ξ̂(D − u))/(1 − ξ̂).          (13)
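The adjustment above can be sketched as a small function (hypothetical figures again; when D = u it reduces to the previous case):

```python
import math

def pure_premium_above_threshold(lam_u, sigma, xi, u, D):
    """Pure premium for an attachment point D >= u: thin the Poisson rate by
    the GPD survival probability at D - u, then multiply by the mean excess
    of the shifted GPD with scale sigma + xi*(D - u)."""
    if xi >= 1:
        raise ValueError("GPD mean does not exist for xi >= 1")
    y = D - u
    if xi != 0:
        lam_D = lam_u * (1.0 + xi * y / sigma) ** (-1.0 / xi)
    else:
        lam_D = lam_u * math.exp(-y / sigma)  # exponential tail limit
    return lam_D * (sigma + xi * y) / (1.0 - xi)
```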

If the attachment point D is less than the threshold u, D cannot be treated as a threshold. Since the extreme value model discards the small and medium losses, we can find a distribution to describe the losses below the threshold, and describe the whole loss with a mixture of the two distributions. There are two common approaches to fitting the main (body) distribution: parametric and nonparametric. Parametric methods commonly use the gamma, Pareto, or lognormal distribution; nonparametric methods can use kernel density estimation over the whole range.

For the parametric method, this paper takes the lognormal distribution as an example: losses below the threshold u are fitted with a lognormal distribution F_1, and the tail above u is fitted with the GPD.

When the random variable X is less than the threshold, take the distribution function as F_1(x). On the other hand, because

P(X > u + y) = P(X > u) P(X − u > y | X > u) = (1 − F_1(u))(1 − G_(ξ,σ)(y)),          (14)

the mixed distribution function can be written as follows:

F*(x) = F_1(x), x ≤ u;  F*(x) = F_1(u) + (1 − F_1(u)) G_(ξ,σ)(x − u), x > u.

Here θ = (μ, σ_1, ξ, σ) is the parameter vector and S*(x) = 1 − F*(x) is the survival function. The expected compensation beyond the attachment point D is then:

E[(X − D)_+] = ∫_D^∞ S*(x) dx = ∫_D^u (1 − F_1(x)) dx + (1 − F_1(u)) σ/(1 − ξ).          (15)

The reinsurance compensation frequency also needs to be adjusted accordingly. If we still take the frequency λ_u of claims exceeding the threshold u as the reference, the overall claim intensity is λ = λ_u/(1 − F_1(u)).

The lognormal distribution function is F_1(x) = Φ((ln x − μ)/σ_1), where Φ is the standard normal distribution function; its density function is

f_1(x) = 1/(x σ_1 √(2π)) exp(−(ln x − μ)²/(2σ_1²)), x > 0.

Then the pure premium can be written as

P = λ E[(X − D)_+] = [λ_u/(1 − F_1(u))] [∫_D^u (1 − F_1(x)) dx + (1 − F_1(u)) σ/(1 − ξ)].          (16)

Using the empirical data to estimate the lognormal parameters and substituting the estimates, the estimated pure premium is:

P̂ = [λ̂_u/(1 − F̂_1(u))] [∫_D^u (1 − F̂_1(x)) dx + (1 − F̂_1(u)) σ̂/(1 − ξ̂)].          (17)
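The mixture premium can be evaluated numerically; the sketch below integrates the lognormal survival function on [D, u] by the midpoint rule and adds the GPD tail term (a sketch under the mixture assumptions above; all figures are hypothetical):

```python
import math

def lognorm_cdf(x, mu, s):
    """Lognormal distribution function F1(x) = Phi((ln x - mu)/s)."""
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (s * math.sqrt(2.0))))

def mixed_pure_premium(lam_u, mu, s, u, D, sigma, xi, steps=10000):
    """Pure premium for D < u under the lognormal-body / GPD-tail mixture:
    midpoint-rule integral of the lognormal survival on [D, u] plus the GPD
    tail contribution, scaled by the overall intensity lam_u/(1 - F1(u))."""
    h = (u - D) / steps
    body = sum((1.0 - lognorm_cdf(D + (i + 0.5) * h, mu, s)) * h
               for i in range(steps))
    surv_u = 1.0 - lognorm_cdf(u, mu, s)
    tail = surv_u * sigma / (1.0 - xi)
    return lam_u / surv_u * (body + tail)
```

When D approaches u the body integral vanishes and the formula reduces to the threshold case λ_u σ/(1 − ξ), which provides a useful consistency check.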

4. Empirical Study

The data in this paper come from the Zhejiang Yearbook and the Typhoon Yearbook [11]. These two databases record the occurrence of typhoons in Zhejiang Province from 1949 to 2012, with the time, place, and loss of each event in detail; there are 68 loss records in total. To minimize the deviation of statistical standards and make the data comparable, we adjust the losses to the 2012 standard according to the GDP of Zhejiang Province: the loss amount is divided by the GDP of the year in which it occurred, then multiplied by the 2012 GDP. The GDP data come from the Statistical Yearbook of Zhejiang. Table 1 gives the basic descriptive statistics of the losses caused by typhoons. The 70% quantile is less than the average, and the variance of the data is large. In addition, the skewness coefficient is 3.51, which shows that a few extreme observations are very large and the data are right-skewed. So we need to handle these extreme data with an extreme value distribution.
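The GDP-based comparability adjustment described above is a single rescaling step; a sketch with hypothetical GDP figures (not Zhejiang's actual data):

```python
def adjust_to_base_year(loss, gdp_event_year, gdp_base_year):
    """Rescale a historical loss to the base-year (2012) standard: divide by
    the GDP of the event year, multiply by the base-year GDP."""
    return loss / gdp_event_year * gdp_base_year

# Hypothetical figures purely for illustration
adjusted = adjust_to_base_year(50.0, 1000.0, 3460.0)  # 50 / 1000 * 3460 = 173.0
```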

Table 1. Basic descriptive statistics of the loss.

Number of observations: 68
Average (million): 209.98
Standard deviation (million): 335.37
Minimum (million): 0.64
30% quantile (million): 38.53
Median (million): 110.84
70% quantile (million): 185.79
Maximum (million): 1912.20
Coefficient of skewness: 3.51

Correct selection of the threshold is the premise of the model's effectiveness and of correct parameter estimation. So before building the POT model, we discuss the selection of an appropriate threshold interval. The two methods of threshold selection are the mean excess function and the Hill chart. Using the Hill chart method, we obtain Fig. 1:

Fig. 1. Hill Chart.

Combining with the above figure, after the order value k = 8 the curve declines slowly and can be regarded as stable; the corresponding order statistics indicate an appropriate threshold range. To sum up, we set candidate thresholds of 20, 60 and 100. We use the maximum likelihood method to estimate the shape parameter and the scale parameter; the parameter estimates obtained are shown in Table 2:

Table 2. The parameter estimates of fitting GPD.

To facilitate the discussion below, we choose u = 60 and its corresponding parameter estimates. The fitted GPD is as follows:

(18)

In the mixed distribution model, we also need to consider the main distribution. Here we use the lognormal distribution of the parametric method to fit the data. To reduce the influence of extreme values on the fit, we only consider the data below the threshold. By the maximum likelihood method we obtain the parameter estimates of the lognormal distribution, so the main distribution density function is

(19)

The data used in this article are typhoon losses in Zhejiang. For a specific insurance product, the extreme value distribution of losses should be determined from the historical data of the company or from local statistics. In the reinsurance contract, the attachment point is a policy variable determined by the specific underwriting risk and related policies. Assume first that the attachment point equals the threshold, so that the compensation frequency of the reinsurance is λ_u; in practical application, the expected number of payments of an insurance policy or portfolio of policies can be taken as the compensation frequency of the Poisson process. If the attachment point is less than the threshold, the mixed distribution model proposed above can be used. According to the three cases discussed above, the typhoon reinsurance premiums are shown in Table 3:

Table 3. The typhoon reinsurance premium under different attachment points.

5. Conclusion

Extreme value theory is a very important statistical method in applied science; it fits the tail of catastrophe loss data well and can improve the fit to the greatest possible extent. In this paper, on the basis of extreme value theory, we build three pricing models for catastrophe reinsurance. In particular, when there is a large deviation between the attachment point and the threshold, we focus on how to use extreme value theory to set the pure premium of excess-of-loss catastrophe reinsurance, so that extreme value theory can be applied more widely to catastrophe risk models.

Acknowledgements

This work was financially supported by the Science and Technology Department of Zhejiang Province of China (2015C33088, 2014R415033), the Educational Commission of Zhejiang Province of China (Y201327793), the Classroom Teaching Reform of Higher Education Project of Zhejiang Province of China (kg 2013276), and the Foundation of Education Research of Zhejiang University of Science and Technology (2011039).

References

1. McNeil, A. J. Extreme value theory for risk managers [M]. Internal Modelling and CAD II, Risk Books, 1999, 93-113.
2. Fisher, R. A., Tippett, L. H. C. Limiting forms of the frequency distribution of the largest or smallest member of a sample [J]. Proceedings of the Cambridge Philosophical Society, 1928(24).
3. Gumbel, E. Statistics of Extremes [M]. New York: Columbia University Press, 1958.
4. SHI Dao-ji. Practical Extreme Value Statistical Methods [M]. Tianjin: Tianjin Science and Technology Press, 2006.
5. Tawn, J. A. Bivariate Extreme Value Theory: Models and Estimation [J]. Biometrika, 1988.
6. ZHAO Zhi-hong, LI Xing-xu. Fitting and Actuarial Research on Extremely Large Loss in Non-life Insurance [J]. Application of Statistics and Management, 2010, 29(3): 336-347.
7. SONG Jia-shan, LI Yong, PENG Cheng. Improvement of the Hill estimation method for threshold selection in extreme value theory [J]. Journal of University of Science and Technology of China, 2008, (9): 1104-1108.
8. OUYANG Zi-sheng. Extreme Quantile Estimation for Heavy-tailed Distributions and a Study of Extreme Risk Measurement [J]. Application of Statistics and Management, 2008, 27(1): 70-75.
9. XIAO Hai-qing, MENG Sheng-wang. EVT and Its Application to Pricing of Catastrophe Reinsurance [J]. Application of Statistics and Management, 2013, 32(2): 1002-1566.
10. JIANG Zheng-fa, HUANG Xu-peng. An actuarial model of life insurance premiums based on the compound Poisson process [J]. Statistics and Decision, 2014(8): 57-59.
11. The Almanac of Zhejiang; The Almanac of Typhoon.
