# Variance of the sum of N independent variables

• Jan 31st 2009, 09:14 AM
James0502
Variance of the sum of N independent variables
Let X1, X2, ..., XN be independent identically distributed random variables, where N is a
non-negative integer valued random variable. Let Z = X1 + X2 + ... + XN (assuming that Z = 0 if N = 0). Find E(Z) and show that
var(Z) = var(N)E(X1)^2 + E(N)var(X1)

Not a clue on this one! I know Expectation of Z is NE(X1)

but how do I show this result?

Many Many thanks
• Jan 31st 2009, 01:52 PM
mr fantastic
Quote:

Originally Posted by James0502
Let X1, X2, ..., XN be independent identically distributed random variables, where N is a
non-negative integer valued random variable. Let Z = X1 + X2 + ... + XN (assuming that Z = 0 if N = 0). Find E(Z) and show that
var(Z) = var(N)E(X1)^2 + E(N)var(X1)

Not a clue on this one! I know Expectation of Z is NE(X1)

but how do I show this result?

Many Many thanks

You're expected to know the theorem that for independent random variables $Var\left[\sum_{i=1}^n (a_i X_i) \right]= \sum_{i=1}^n a_i^2 Var(X_i)$.

Apply this theorem to your problem.
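Not part of the original thread, but here is a quick Monte Carlo sanity check of that theorem, using only Python's standard library. The distributions (a uniform and an exponential) and the coefficients are illustrative choices, not anything from the problem:

```python
import random
import statistics

# Check Var(a1*X1 + a2*X2) = a1^2*Var(X1) + a2^2*Var(X2)
# for independent X1 ~ Uniform(0,1) and X2 ~ Exponential(1).
rng = random.Random(0)
a1, a2 = 2.0, 3.0
samples = [a1 * rng.uniform(0, 1) + a2 * rng.expovariate(1.0)
           for _ in range(200_000)]
empirical = statistics.variance(samples)

# Var(Uniform(0,1)) = 1/12, Var(Exponential(1)) = 1
theoretical = a1**2 * (1 / 12) + a2**2 * 1.0
print(empirical, theoretical)  # should agree within Monte Carlo error
```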
• Jan 31st 2009, 10:59 PM
hahavincent
This is a problem of a random sum of random variables, which is very useful in actuarial science.
First, you already noted that E(X1 + ... + Xn) = n*E(X1) for fixed n, so E(Z|N) = N*E(X1) (A). Keep this result; it will be useful later.
Second, recall the law of total variance: var(Z) = E(var(Z|N)) + var(E(Z|N)) (B).

We will work out the second term on the right side of equation (B):
by (A), var(E(Z|N)) = var(N*E(X1)), and since E(X1) is a constant,
var(E(Z|N)) = var(N)*(E(X1))^2.

Now the first term on the right side of equation (B):
var(Z|N) = N*var(X1), since the Xi's are independent of each other and independent of N,
so E(var(Z|N)) = E(N)*var(X1).

Combine the two terms and you get the result you expected. You need to show a bit more detail when you justify the first term in equation (B); I skipped some steps. If you want to be an actuary, we should talk more! (Wink)
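To make the formula concrete, here is a small simulation of a random sum, outside the original thread. The specific distributions (N uniform on {0,...,5}, Xi exponential with rate 1) are illustrative assumptions; the identity var(Z) = var(N)E(X1)^2 + E(N)var(X1) should hold for any choice with N independent of the Xi's:

```python
import random
import statistics

# Monte Carlo check of var(Z) = var(N)*E(X1)^2 + E(N)*var(X1)
# with N ~ Uniform{0,...,5} and X_i ~ Exponential(1), N independent of the X_i.
rng = random.Random(42)

def random_sum():
    n = rng.randint(0, 5)                               # N: non-negative integer r.v.
    return sum(rng.expovariate(1.0) for _ in range(n))  # Z = 0 when N = 0

z = [random_sum() for _ in range(200_000)]
empirical = statistics.variance(z)

# E(N) = 2.5 and Var(N) = 35/12 for Uniform{0,...,5}; E(X1) = Var(X1) = 1 for Exp(1)
theoretical = (35 / 12) * 1.0**2 + 2.5 * 1.0
print(empirical, theoretical)  # should agree within Monte Carlo error
```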
• Feb 1st 2009, 02:53 AM
James0502
many thanks guys!

I am still unsure how to get (B), though. I can't think of any useful identities I know to give me this result - we've only really just begun this section of the course, so I'm still very hazy!

thanks
• Feb 24th 2009, 05:28 PM
matheagle
Quote:

Originally Posted by James0502
Let X1, X2, ..., XN be independent identically distributed random variables, where N is a
non-negative integer valued random variable. Let Z = X1 + X2 + ... + XN (assuming that Z = 0 if N = 0). Find E(Z) and show that
var(Z) = var(N)E(X1)^2 + E(N)var(X1)

Not a clue on this one! I know Expectation of Z is NE(X1)

but how do I show this result?

Many Many thanks

N is a random variable, known as a stopping time.
So $E(Z)=NE(X_1)$ does not make sense: the left side is a constant while the right side is random.
I don't know whether we need independence between N and the $X_i$'s in the mean case, but we probably do need it to obtain the variance.
Most likely $E(Z)=E(N)E(X_1)$.
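A quick numerical check of $E(Z)=E(N)E(X_1)$ (Wald's identity), not from the original thread. The setup below is an illustrative assumption: N uniform on {0,...,5} so E(N) = 2.5, and Xi exponential with rate 1 so E(X1) = 1:

```python
import random
import statistics

# Sanity check of Wald's identity E(Z) = E(N)*E(X1)
# with N ~ Uniform{0,...,5} and X_i ~ Exponential(1), N independent of the X_i.
rng = random.Random(1)

def z_draw():
    n = rng.randint(0, 5)
    return sum(rng.expovariate(1.0) for _ in range(n))

mean_z = statistics.fmean(z_draw() for _ in range(200_000))
print(mean_z)  # E(N)*E(X1) = 2.5 * 1 = 2.5, up to Monte Carlo error
```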