# Don't know where to start...

• Feb 13th 2009, 06:01 PM
synclastica_86
Don't know where to start...
This is what I need to do but don't know where to start, please give me some pointers. It has to do with expected value of nonnegative random variable.

Let $\displaystyle X\in \{0,1,2,\dots\}$
Show that $\displaystyle E(X)=\sum_{k=0}^{\infty}(1-F_X(k))$

I know that $\displaystyle E(X)=\sum_{k=0}^{\infty}xp(k)$,
but what is $\displaystyle p(k)$ and $\displaystyle F_X(k)$?

Where do i even start?
• Feb 14th 2009, 09:13 AM
Moo
Hello,
Quote:

Originally Posted by synclastica_86
This is what I need to do but don't know where to start, please give me some pointers. It has to do with expected value of nonnegative random variable.

Let $\displaystyle X\in \{0,1,2,\dots\}$
Show that $\displaystyle E(X)=\sum_{k=0}^{\infty}(1-F_X(k))$

I know that $\displaystyle E(X)=\sum_{k=0}^{\infty} {\color{red}k} p(k)$,
but what is $\displaystyle p(k)$ and $\displaystyle F_X(k)$?

Where do i even start?

I don't understand... You're given this problem, but you don't even know what $\displaystyle p(k)$ and $\displaystyle F_X(k)$ are?
That's not normal!

Anyway, in general, $\displaystyle p(k)=P(X=k)$ is the probability mass function, and $\displaystyle F_X(k)=P(X \leq k)$ is the cumulative distribution function.

Hence $\displaystyle 1-F_X(k)=P(X>k)$
Since $\displaystyle X \in \{0,1,\dots\}$, $\displaystyle \forall k \in \{0,1,\dots\}, ~ 1-F_X(k)=P(X>k)=p(k+1)+p(k+2)+\dots=\sum_{j=k+1}^\infty p(j)$

$\displaystyle \begin{aligned} 1-F_X(0) &=\sum_{j=1}^\infty p(j)= &p(1)+ &p(2)+&p(3)+\dots+p(n)+\dots \\ 1-F_X(1) &=\sum_{j=2}^\infty p(j)=& &p(2)+&p(3)+\dots+p(n)+\dots \\ 1-F_X(2) &=\sum_{j=3}^\infty p(j)= & & &p(3)+\dots+p(n)+\dots \\ \vdots & & & & \\ \end{aligned}$

Can you see a pattern when you sum these?

$\displaystyle (1-F_X(0))+(1-F_X(1))=p(1)+2 \sum_{j=2}^\infty p(j)$
$\displaystyle (1-F_X(0))+(1-F_X(1))+(1-F_X(2))=p(1)+2p(2)+3 \sum_{j=3}^\infty p(j)$

Now it looks logical that $\displaystyle \sum_{k=0}^\infty (1-F_X(k))=\sum_{k=1}^\infty kp(k)=\sum_{k=0}^\infty kp(k)=E(X)$

But I can't manage to find a way to prove it rigorously. Maybe I'll find one tomorrow, or someone else will (Worried)

-----------------------------------------------------
You can prove by induction, for $\displaystyle n \in \{0,1,2,\dots\}$, that:
$\displaystyle \sum_{k=0}^n (1-F_X(k))=\left[\sum_{k=1}^n kp(k)\right]+(n+1) \sum_{k=n+1}^\infty p(k)$
Then take $\displaystyle n$ to infinity... But I don't know how to deal with it :s
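(Editor's aside: the identity is easy to sanity-check numerically. Here is a minimal Python sketch, assuming for illustration a geometric distribution $\displaystyle p(k)=(1-q)q^k$ on $\displaystyle \{0,1,2,\dots\}$, truncated at a large $N$; the distribution is not part of the original post, just a convenient example with known mean $\displaystyle q/(1-q)$.)

```python
# Sanity check of E(X) = sum_{k>=0} (1 - F_X(k)) on a truncated
# geometric distribution p(k) = (1-q) * q^k, k = 0, 1, 2, ...
q = 0.6          # illustrative parameter; exact mean is q/(1-q) = 1.5
N = 200          # truncation point; the tail beyond N is negligible

p = [(1 - q) * q**k for k in range(N + 1)]        # p(k) = P(X = k)
F = [sum(p[: k + 1]) for k in range(N + 1)]       # F_X(k) = P(X <= k)

expectation = sum(k * p[k] for k in range(N + 1))  # sum_k k p(k)
tail_sum = sum(1 - F[k] for k in range(N + 1))     # sum_k (1 - F_X(k))

print(expectation, tail_sum)   # both are close to 1.5
```

Both sums agree to within floating-point error, as the identity predicts.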
• Feb 14th 2009, 09:35 AM
Laurent
Quote:

Originally Posted by Moo
You can prove by induction, for $\displaystyle n \in \{0,1,2,\dots\}$, that:
$\displaystyle \sum_{k=0}^n (1-F_X(k))=\left[\sum_{k=1}^n kp(k)\right]+(n+1) \sum_{k=n+1}^\infty p(k)$
Then take $\displaystyle n$ to infinity... But I don't know how to deal with it :s

If $\displaystyle E[X]=\infty$, then the right-hand side goes to infinity (it is greater than $\displaystyle \sum_{k=1}^n kp(k)$), so that the equality holds.

Suppose now $\displaystyle E[X]<\infty$. In order to take the limit as $\displaystyle n\to\infty$, we have to justify $\displaystyle n \sum_{k=n}^\infty p(k)\to_n 0$.

This can be written $\displaystyle n P(X\geq n)\to_n 0$. However, we have $\displaystyle n P(X\geq n)\leq E[X {\bf 1}_{(X\geq n)}]$, and the dominated convergence theorem shows that the right-hand side converges to 0 since $\displaystyle E[X]<\infty$.
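(Editor's aside: for a concrete feel for that tail estimate, here is a tiny Python sketch assuming a geometric distribution, which is not from the post; for it, $\displaystyle P(X\geq n)=q^n$ exactly, so the scaled tail has a closed form.)

```python
# Illustration: for a geometric X with p(k) = (1-q) q^k we have
# P(X >= n) = q^n, so n * P(X >= n) = n * q^n, which tends to 0.
q = 0.6
for n in (1, 5, 10, 50, 100):
    print(n, n * q**n)   # the scaled tail shrinks rapidly towards 0
```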

-------------

There's another solution that avoids taking a limit. It uses Fubini's theorem for double series.

Indeed, if you write

$\displaystyle \sum_{k=0}^\infty (1-F_X(k))=\sum_{k=0}^\infty \sum_{i=k+1}^\infty p(i) = \sum_{k=0}^\infty \sum_{i=0}^\infty {\bf 1}_{(i>k)} p(i)$,

the equality amounts to interchanging the sum signs; I'll let you convince yourself of this.
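(Editor's aside: the interchange can be checked numerically on a truncated example; the geometric weights below are only an illustration, not part of the argument.)

```python
# Check: summing 1_{(i>k)} p(i) over k first, then over i first,
# on a truncated geometric distribution p(i) = (1-q) q^i.
N = 100
q = 0.5
p = [(1 - q) * q**i for i in range(N + 1)]

# k first, as on the left-hand side: sum_k sum_{i>k} p(i)
lhs = sum(sum(p[i] for i in range(k + 1, N + 1)) for k in range(N + 1))

# i first: p(i) is counted once for each k in {0, ..., i-1},
# i.e. exactly i times, which gives sum_i i p(i)
rhs = sum(i * p[i] for i in range(N + 1))

print(lhs, rhs)   # both are close to E(X) = q/(1-q) = 1.0
```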
• Feb 14th 2009, 04:06 PM
synclastica_86
Quote:

Originally Posted by Laurent
If $\displaystyle E[X]=\infty$, then the right-hand side goes to infinity (it is greater than $\displaystyle \sum_{k=1}^n kp(k)$), so that the equality holds.

Suppose now $\displaystyle E[X]<\infty$. In order to take the limit as $\displaystyle n\to\infty$, we have to justify $\displaystyle n \sum_{k=n}^\infty p(k)\to_n 0$.

This can be written $\displaystyle n P(X\geq n)\to_n 0$. However, we have $\displaystyle n P(X\geq n)\leq E[X {\bf 1}_{(X\geq n)}]$, and the dominated convergence theorem shows that the right-hand side converges to 0 since $\displaystyle E[X]<\infty$.

-------------

There's another solution that avoids taking a limit. It uses Fubini's theorem for double series.

Indeed, if you write

$\displaystyle \sum_{k=0}^\infty (1-F_X(k))=\sum_{k=0}^\infty \sum_{i=k+1}^\infty p(i) = \sum_{k=0}^\infty \sum_{i=0}^\infty {\bf 1}_{(i>k)} p(k)$,

the equality amounts to interchanging the sum signs; I'll let you convince yourself of this.

Thanks for all the help, everyone. I know this might sound stupid, since it's the first time I've done any kind of stats, but I thought Fubini's theorem was used on integrals in vector calculus. How did you turn that into the two sums? I understand the reasoning behind what Moo did the first time best. Am I wrong to think so?
• Feb 15th 2009, 12:23 AM
Laurent
Quote:

Originally Posted by synclastica_86
Thanks for all the help, everyone. I know this might sound stupid, since it's the first time I've done any kind of stats, but I thought Fubini's theorem was used on integrals in vector calculus. How did you turn that into the two sums? I understand the reasoning behind what Moo did the first time best. Am I wrong to think so?

What Moo did is also "Fubini" (interchanging sums), but for a finite sum; in that case we don't call it Fubini because it is always true by simple properties of sums.

Let me repeat what Moo wrote in a pretty way:
Quote:

$\displaystyle \begin{aligned} 1-F_X(0) &=\sum_{j=1}^\infty p(j)= &p(1)+ &p(2)+&p(3)+\dots+p(n)+\dots \\ 1-F_X(1) &=\sum_{j=2}^\infty p(j)=& &p(2)+&p(3)+\dots+p(n)+\dots \\ 1-F_X(2) &=\sum_{j=3}^\infty p(j)= & & &p(3)+\dots+p(n)+\dots \\ \vdots & & & & \\ \end{aligned}$

If you sum line after line (there are infinitely many), you get $\displaystyle \sum_{k=0}^\infty (1-F_X(k))$.

If you sum column after column, you get $\displaystyle 1p(1)+2p(2)+3p(3)+\cdots = \sum_{k=1}^\infty k p(k)$.

In order to justify why these sums are the same, you need a theorem. Fubini's theorem does hold for double integrals, but there is a very similar theorem for double sums, and I think it is customary to call it Fubini as well.

Anyway, this theorem says that if we have nonnegative numbers $\displaystyle a_{ij}\geq 0$ (like in this case), then we can interchange the sum signs (whether the sum is finite or not):

$\displaystyle \sum_i\sum_j a_{ij}=\sum_j\sum_i a_{ij}$.

This amounts to summing the columns or the rows in the previous sketch.

NB: The theorem also gives a condition for doing the same in case there are negative numbers:

if $\displaystyle \sum_i \sum_j |a_{ij}|<\infty$, then $\displaystyle \sum_i \sum_j a_{ij} = \sum_j \sum_i a_{ij}$.

--
You may prefer not to use this theorem about double series and instead sum only the first $\displaystyle n+1$ rows of the previous sketch to get the equality Moo gives:

$\displaystyle \sum_{k=0}^n (1-F_X(k))=\left[\sum_{k=1}^n kp(k)\right]+(n+1) \sum_{k=n+1}^\infty p(k).$

Then you need to take the limit as $\displaystyle n$ tends to $\displaystyle \infty$. I gave a possible justification of the limit in the first part of my previous post.
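(Editor's aside: the displayed partial-sum identity can also be verified numerically for a small $n$. The sketch below assumes a geometric distribution, not from the post, for which the infinite tail sum $\displaystyle \sum_{k=n+1}^\infty p(k)$ has the closed form $\displaystyle q^{n+1}$.)

```python
# Check of Moo's partial-sum identity for a geometric X with
# p(k) = (1-q) q^k; here sum_{k=n+1}^inf p(k) = q^(n+1) exactly.
q = 0.6
n = 7

p = [(1 - q) * q**k for k in range(n + 1)]     # p(k) = P(X = k)
F = [sum(p[: k + 1]) for k in range(n + 1)]    # F_X(k) = P(X <= k)

lhs = sum(1 - F[k] for k in range(n + 1))
rhs = sum(k * p[k] for k in range(1, n + 1)) + (n + 1) * q ** (n + 1)

print(lhs, rhs)   # agree up to floating-point rounding
```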
• Feb 15th 2009, 12:35 AM
Moo
Hmmm... I was trying in vain to deal with Fubini, but there was something wrong:
Quote:

$\displaystyle \sum_{k=0}^\infty (1-F_X(k))=\sum_{k=0}^\infty \sum_{i=k+1}^\infty p(i) = \sum_{k=0}^\infty \sum_{i=0}^\infty {\bf 1}_{(i>k)} p({\color{red}k})$
Isn't the red k an i? :D

After that, it gives the solution very easily... That's great! Thinking of the indicator function was brilliant, lol.

Thanks for that Fubini stuff, Laurent. :p
• Feb 15th 2009, 01:57 AM
Laurent
Quote:

Originally Posted by Moo
Hmmm... I was trying in vain to deal with Fubini, but there was something wrong:

Isn't the red k an i? :D

Yes, indeed... (Blush) I'm about to correct the mistake.