Independent Bernoulli Random Variable

• Feb 21st 2013, 04:28 AM
wrexlive1
Independent Bernoulli Random Variable
Suppose Y1, Y2, Y3, ... are independent Bernoulli random variables with

P(Yi = 0) = 1 =(1/i!)
P(Yi = 1) = 1/i!

Define X = Y1 + Y2 + Y3 + ...
Show that E[X] = e - 1.

Not too sure how to do this question, so wondering if anyone can help me?

• Feb 21st 2013, 10:00 AM
Siron
Re: Independent Bernoulli Random Variable
Note that the expected value $E[\cdot]$ is linear, which we can use here. Unfortunately, I'm confused by your notation. Can you explain what you mean by $1=\frac{1}{i!}$? We know that $Y_1,Y_2, \ldots$ are discrete random variables, hence $E[Y_i] = \sum_{x} x p_x$. Finally, $E[X] = \sum_{i=1}^{\infty} E[Y_i]$.
• Feb 21st 2013, 10:25 AM
wrexlive1
Re: Independent Bernoulli Random Variable
Sorry, there was a typing error; it should read as
P(Yi = 0) = 1 - (1/i!)
P(Yi = 1) = 1/i!
• Feb 21st 2013, 10:38 AM
Siron
Re: Independent Bernoulli Random Variable
Quote:

Originally Posted by wrexlive1
Sorry, there was a typing error; it should read as
P(Yi = 0) = 1 - (1/i!)
P(Yi = 1) = 1/i!

Okay. We can compute the expected value of $Y_i$ as follows: $E[Y_i] = 0\left(1-\frac{1}{i!}\right)+1\left(\frac{1}{i!}\right)=\frac{1}{i!}$.
Therefore, $E[X]=\sum_{i=1}^{\infty} E[Y_i]=\sum_{i=1}^{\infty} \frac{1}{i!} = \sum_{i=0}^{\infty} \frac{1}{i!}-1$. Using the fact that $e^x = \sum_{i=0}^{\infty} \frac{x^i}{i!}$, we have $\sum_{i=0}^{\infty} \frac{1}{i!}=e$. Hence, $E[X]=e-1$.
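For anyone who wants a sanity check, here is a small Python sketch (not part of the original exercise) that verifies the derivation numerically: it sums the partial series $\sum 1/i!$ and also estimates $E[X]$ by Monte Carlo simulation of the $Y_i$. The truncation at 20 terms and the trial count are arbitrary choices.

```python
import math
import random

# Direct check: the partial sum of E[Y_i] = 1/i! converges quickly to e - 1.
partial_sum = sum(1 / math.factorial(i) for i in range(1, 20))
print(partial_sum, math.e - 1)  # both approximately 1.7182818...

# Monte Carlo check: sample X = Y_1 + ... + Y_n with P(Y_i = 1) = 1/i!.
# (Note Y_1 = 1 always, since 1/1! = 1.)
random.seed(0)
n_terms, n_trials = 20, 200_000
total = 0
for _ in range(n_trials):
    total += sum(1 for i in range(1, n_terms + 1)
                 if random.random() < 1 / math.factorial(i))
mean_x = total / n_trials
print(mean_x)  # should be close to e - 1
```

Truncating the series is harmless here because the tail $\sum_{i \ge 20} 1/i!$ is astronomically small.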