Originally Posted by **arcketer**

Here's the problem that's bugging me: an insurance company has 5000 policies and assumes these policies are all independent. Each policy is governed by the same distribution, with a mean of $495 and a variance of $30,000. What is the probability that the total claims for the year will be less than $2,500,000?

So I calculate the expected value of the 5000 policies as E(5000X) = 5000*E(X) = 5000*495 = 2,475,000.

Then, the next step is to calculate the variance: V(5000X) = 5000^2 * V(X) = 25,000,000 * 30,000 = 750,000,000,000. However, I am told that this is not the correct way to calculate the variance in this case. The property V(aX) = a^2 * V(X) is one I've used for a while and gotten comfortable with. Why does it not apply in this case?
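For anyone who wants to see the distinction empirically, here's a quick Monte Carlo sketch contrasting the two quantities: the sum of 5000 independent policies, S = X_1 + ... + X_5000, versus a single policy scaled by 5000, i.e. 5000·X. (The problem doesn't specify the per-policy distribution, so a normal with the stated mean and variance is assumed purely for illustration; only the mean and variance matter here.)

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5000          # number of policies (from the problem)
mu = 495.0        # per-policy mean (from the problem)
var = 30_000.0    # per-policy variance (from the problem)

# Illustrative assumption: per-policy claims drawn from a normal
# distribution with the stated mean and variance.
trials = 2_000
claims = rng.normal(mu, np.sqrt(var), size=(trials, n))

# S = X_1 + ... + X_n: one total per trial (5000 independent draws summed)
totals = claims.sum(axis=1)

# 5000 * X: a SINGLE policy's claim scaled up by 5000
scaled = n * claims[:, 0]

print(totals.var())   # ≈ n * var    = 150,000,000
print(scaled.var())   # ≈ n**2 * var = 750,000,000,000
```

The empirical variance of the sum comes out near n·V(X), while the scaled single variable comes out near n²·V(X): V(aX) = a²·V(X) applies only when you multiply *one* random variable by a constant, which is a very different (and much more volatile) object than the sum of n independent copies.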