# Thread: Discrete Random Variable and Expectation

1. ## Discrete Random Variable and Expectation

Let $\displaystyle X$ be a discrete random variable that takes only non-negative integer values (i.e. $\displaystyle R_X \subset \mathbb{N}$).

Show that $\displaystyle \mathbb{E}(X) = \sum_{x=0}^{\infty}\mathbb{P}\{X>x\}$.

I don't know how to work this out and I would appreciate any help. I know that $\displaystyle \sum_{x=0}^{\infty}\mathbb{P}\{X>x\} = \sum_{x=0}^{\infty} \left(1 - F(x)\right)$, where $F(x)$ is the CDF, but I cannot see how even that gives the expectation. Any help would be appreciated, and thanks in advance.

2. Originally Posted by slevvio
Let $\displaystyle X$ be a discrete random variable that takes only non-negative integer values (i.e. $\displaystyle R_X \subset \mathbb{N}$).

Show that $\displaystyle \mathbb{E}(X) = \sum_{x=0}^{\infty}\mathbb{P}\{X>x\}$.

I don't know how to work this out and I would appreciate any help. I know that $\displaystyle \sum_{x=0}^{\infty}\mathbb{P}\{X>x\} = \sum_{x=0}^{\infty} \left(1 - F(x)\right)$, where $F(x)$ is the CDF, but I cannot see how even that gives the expectation. Any help would be appreciated, and thanks in advance.
The simplest proof consists in noting that $\displaystyle X=\sum_{k=0}^\infty {\rm 1}_{\{X>k\}}$, where $\displaystyle {\rm 1}_{\{X>k\}}$ (called an indicator function) is a random variable that equals 1 if $\displaystyle k<X$ and 0 otherwise. This is because exactly the first $\displaystyle X$ indicator functions (those with $\displaystyle k=0,1,\dots,X-1$) equal 1 and all the others equal 0, so the sum is just $\displaystyle X$.
Then, by the monotone convergence theorem (which justifies exchanging the expectation with the infinite sum) together with linearity of expectation, $\displaystyle E[X]=\sum_{k=0}^\infty E\left[{\rm 1}_{\{X>k\}}\right]$, and $\displaystyle E\left[ {\rm 1}_{\{X>k\}} \right]= 1\cdot P(X>k) + 0\cdot P(X\le k) = P(X>k)$, as you can check (you integrate 1 when $\displaystyle X>k$ and 0 otherwise). So this is it.
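If it helps to see the identity numerically, here is a small sketch that checks $\displaystyle E[X] = \sum_{k=0}^\infty P(X>k)$ for a Poisson variable (my own choice of example, truncated at a large cutoff so both sums are finite):

```python
# Sanity check of E[X] = sum_{k>=0} P(X > k) for a non-negative
# integer-valued X. Example: X ~ Poisson(3.5), truncated at a cutoff
# beyond which the tail mass is negligible.
import math

lam = 3.5
cutoff = 60  # P(X >= 60) is astronomically small for lam = 3.5

# pmf of Poisson(lam): P(X = k) = e^{-lam} lam^k / k!
pmf = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(cutoff)]

# direct expectation: sum over k of k * P(X = k)
direct = sum(k * p for k, p in enumerate(pmf))

# tail-sum form: sum over k of P(X > k), where P(X > k) = sum_{j > k} P(X = j)
tails = sum(sum(pmf[k + 1:]) for k in range(cutoff))

print(direct, tails)  # both should be close to lam = 3.5
assert abs(direct - tails) < 1e-9
assert abs(direct - lam) < 1e-9
```

Each term $P(X=j)$ gets counted once in $P(X>k)$ for each of $k=0,\dots,j-1$, i.e. $j$ times in total, which is exactly the swap of summation order behind the indicator argument above.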