- Feb 21st 2013, 03:28 AM, wrexlive1: Independent Bernoulli Random Variables
Suppose Y1, Y2, Y3, ... are independent Bernoulli random variables with

P(Yi = 0) = 1 =(1/i!)

P(Yi = 1) = 1=i!

Defining X = Y1 + Y2 + Y3 + ..., show that E[X] = e - 1.

Not too sure how to do this question, so I'm wondering if anyone can help me?

- Feb 21st 2013, 09:00 AM, Siron: Re: Independent Bernoulli Random Variables
Note that the expected value $\displaystyle E$ is linear, which we can use here. Unfortunately, I'm confused by your notation. Can you explain what you mean by $\displaystyle 1=\frac{1}{i!}$? We know that $\displaystyle Y_1,Y_2, \ldots$ are discrete random variables, hence $\displaystyle E[Y_i] = \sum_{x} x p_x$. Finally, $\displaystyle E[X] = \sum_{i=1}^{\infty} E[Y_i]$.

- Feb 21st 2013, 09:25 AM, wrexlive1: Re: Independent Bernoulli Random Variables
Sorry, there has been a typing error; it should read as

P(Yi = 0) = 1 - (1/i!)

P(Yi = 1) = 1/i!

- Feb 21st 2013, 09:38 AM, Siron: Re: Independent Bernoulli Random Variables
Okay. We can compute the expected value of $\displaystyle Y_i$ as follows: $\displaystyle E[Y_i] = 0\left(1-\frac{1}{i!}\right)+1\left(\frac{1}{i!}\right) = \frac{1}{i!}$.

Therefore, $\displaystyle E[X]=\sum_{i=1}^{\infty} E[Y_i]=\sum_{i=1}^{\infty} \frac{1}{i!} = \sum_{i=0}^{\infty} \frac{1}{i!}-1$. Using the fact that $\displaystyle e^x = \sum_{i=0}^{\infty} \frac{x^i}{i!}$ we have $\displaystyle \sum_{i=0}^{\infty} \frac{1}{i!}=e$. Hence, $\displaystyle E[X]=e-1$.
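The result above can also be sanity-checked numerically. Here is a quick Monte Carlo sketch in Python; the truncation at 20 terms is an assumption made for the simulation (not part of the original problem), justified because $1/i!$ shrinks super-exponentially:

```python
import math
import random

random.seed(0)

# P(Y_i = 1) = 1/i! for i = 1, 2, ..., 20.
# Truncating the infinite sum here is an assumption: terms
# beyond i = 20 have negligible probability of being 1.
probs = [1.0 / math.factorial(i) for i in range(1, 21)]

def sample_X():
    # One draw of X = Y_1 + Y_2 + ... (truncated at 20 terms):
    # each Y_i is 1 with probability 1/i!, else 0.
    return sum(random.random() < p for p in probs)

trials = 200_000
estimate = sum(sample_X() for _ in range(trials)) / trials

print(estimate)     # should be close to e - 1
print(math.e - 1)   # 1.718281828...
```

With 200,000 trials the sample mean should agree with $e - 1 \approx 1.71828$ to within a couple of hundredths, since each $Y_i$ contributes $\frac{1}{i!}\left(1 - \frac{1}{i!}\right)$ to the variance of a single draw.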