Originally Posted by **sharpe**

Hello,

I work in the insurance industry. One of the problems we face is working out how various risks diversify with one another. One common approach is to hold a certain amount of capital against each risk the company faces, and then specify a correlation factor between each pair of risks.

For example, an insurance company might have three risks: say, equity risk, interest rate risk, and mortality risk. The company might think it needs 100, 50 and 20 units of capital for each risk individually.

We take a vector A = (x1, x2, x3) = (100, 50, 20) in this case.

The correlation matrix between the risks might look like:

1.0  0.5  0.0
0.5  1.0  0.0
0.0  0.0  1.0

We call this matrix B. We can calculate the amount of capital the insurer holds, allowing for diversification, as:

Diversified capital = Sqrt(Transpose(A) * B * A )
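For the numbers above, this can be checked quickly in Python (my own illustration, not part of the original post; NumPy is assumed to be available):

```python
import numpy as np

# Capital vector and correlation matrix from the example above
A = np.array([100.0, 50.0, 20.0])
B = np.array([
    [1.0, 0.5, 0.0],
    [0.5, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])

# Diversified capital = sqrt(A^T * B * A)
diversified_capital = np.sqrt(A @ B @ A)
print(diversified_capital)  # about 133.79, versus 170 undiversified
```

So the diversification benefit here is roughly 170 - 133.79 = 36.21 units of capital.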

One question I had: there must be an optimal level for each x that minimises the diversified capital relative to the sum of the x's (the sum in this example being 100 + 50 + 20 = 170). In practice this would give the optimal balance of risks the insurer should take to get the most diversification benefit. There is a numerical answer using Solver in Excel, but I wanted a closed-form solution.

I had a go at this: I differentiated (diversified capital) / (sum of the x's) and set the derivative equal to zero (as a minimum). This works and gives a closed-form answer. However, we have a constraint: each of the x's must be at least zero. Again, this can be solved numerically with Solver in Excel, but is there a closed-form solution allowing for the constraint that the x's must be non-negative?
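For what it's worth, the unconstrained closed form can be sketched like this (my reconstruction of the derivation described above, not something stated explicitly in the post). The ratio sqrt(x^T B x) / (sum of x's) is unchanged when x is rescaled, so minimising it is equivalent to minimising x^T B x subject to the x's summing to 1; a Lagrange-multiplier argument then gives x* proportional to B^{-1} * 1, where 1 is the vector of ones:

```python
import numpy as np

B = np.array([
    [1.0, 0.5, 0.0],
    [0.5, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])
ones = np.ones(3)

# Unconstrained stationary point: x* proportional to B^{-1} 1
x_raw = np.linalg.solve(B, ones)
x_star = x_raw / x_raw.sum()          # normalise so the weights sum to 1

# Minimal ratio sqrt(x^T B x) / (1^T x); algebraically sqrt(1 / (1^T B^{-1} 1))
best_ratio = np.sqrt(x_star @ B @ x_star) / x_star.sum()
print(x_star)      # [2/7, 2/7, 3/7] for this B
print(best_ratio)  # sqrt(3/7), roughly 0.6547
```

Note that for correlation matrices other than this example, B^{-1} * 1 can have negative components, which is exactly where the non-negativity constraint starts to bind.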

I was thinking the likely way to look at this is quadratic programming, in which case using Solver in Excel may actually be the best way to do it.
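Since the number of risks is small, one pragmatic alternative to Solver is to enumerate the possible active sets of the non-negativity constraints: for each subset of risks, solve the unconstrained problem restricted to that subset, discard candidates with negative weights, and keep the best feasible one. This is a standard KKT/active-set enumeration, offered here as a sketch of my own (assuming NumPy and a positive-definite B), not something from the thread:

```python
import itertools
import numpy as np

def min_diversification_ratio(B):
    """Minimise sqrt(x^T B x) / (1^T x) over x >= 0, x != 0,
    by enumerating active sets (only practical for small n)."""
    n = B.shape[0]
    best_ratio, best_x = np.inf, None
    for r in range(1, n + 1):
        for subset in itertools.combinations(range(n), r):
            idx = list(subset)
            sub = B[np.ix_(idx, idx)]
            # Unconstrained optimum on this subset: weights proportional to sub^{-1} 1
            w = np.linalg.solve(sub, np.ones(r))
            if np.any(w < 0):
                continue  # this candidate violates the non-negativity constraint
            x = np.zeros(n)
            x[idx] = w / w.sum()  # normalise so the kept weights sum to 1
            ratio = np.sqrt(x @ B @ x) / x.sum()
            if ratio < best_ratio:
                best_ratio, best_x = ratio, x
    return best_ratio, best_x

B = np.array([
    [1.0, 0.5, 0.0],
    [0.5, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])
ratio, x = min_diversification_ratio(B)
print(ratio, x)  # for this B the unconstrained optimum is already non-negative
```

For the example matrix this simply reproduces the unconstrained answer, because no weight wants to go negative; for matrices where it does, the enumeration finds the constrained optimum without Excel.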

Any help would be most appreciated!