## Proof

1. Let X and Y be discrete random variables and a, b real numbers.
Prove E(aX + bY) = aE(X) + bE(Y) and Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y).

Both of these seem easy, but I can't prove why E(X+Y) = E(X) + E(Y).

2. Originally Posted by whatsanihar
Let X and Y be discrete random variables and a, b real numbers.
Prove E(aX + bY) = aE(X) + bE(Y) and Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y).

Both of these seem easy, but I can't prove why E(X+Y) = E(X) + E(Y).
Look at the definition of expectation for discrete RVs:

$\displaystyle E(f(U))=\sum f(u_i) p(u_i)$

So (assuming suitable conditions allowing us to change orders of summation etc hold):

$\displaystyle E(X+Y)=\sum_{i,j} (x_i+y_j) p(x_i,y_j) = \sum_i \sum_j x_i p(x_i,y_j) + \sum_j \sum_i y_j p(x_i,y_j)$ $\displaystyle =\sum_i x_i p(x_i) +\sum_j y_j p(y_j)$

(here we are writing the marginal distribution $\displaystyle p(x_i)=\sum_j p(x_i,y_j)$ and similar for the other case)

etc

CB
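
Carrying the constants a and b through the same computation (a sketch in the same notation; the constants simply factor out of the sums):

$\displaystyle E(aX+bY)=\sum_{i,j} (a x_i+b y_j) p(x_i,y_j) = a\sum_i x_i p(x_i) + b\sum_j y_j p(y_j) = a E(X)+b E(Y)$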

3. Originally Posted by CaptainBlack
Look at the definition of expectation for discrete RVs:

$\displaystyle E(f(U))=\sum f(u_i) p(u_i)$

So (assuming suitable conditions allowing us to change orders of summation etc hold):

$\displaystyle E(X+Y)=\sum_{i,j} (x_i+y_j) p(x_i,y_j) = \sum_i \sum_j x_i p(x_i,y_j) + \sum_j \sum_i y_j p(x_i,y_j)$ $\displaystyle =\sum_i x_i p(x_i) +\sum_j y_j p(y_j)$

(here we are writing the marginal distribution $\displaystyle p(x_i)=\sum_j p(x_i,y_j)$ and similar for the other case)

etc

CB
Yeah, that's what I did for the expected value and got that. But what should I do for Var(X+Y)? Var(X) = E[(X - u)^2], where u = E(X), but what is Var(X+Y) equal to?

4. Originally Posted by CaptainBlack
Look at the definition of expectation for discrete RVs:

$\displaystyle E(f(U))=\sum f(u_i) p(u_i)$

So (assuming suitable conditions allowing us to change orders of summation etc hold):

$\displaystyle E(X+Y)=\sum_{i,j} (x_i+y_j) p(x_i,y_j) = \sum_i \sum_j x_i p(x_i,y_j) + \sum_j \sum_i y_j p(x_i,y_j)$ $\displaystyle =\sum_i x_i p(x_i) +\sum_j y_j p(y_j)$

(here we are writing the marginal distribution $\displaystyle p(x_i)=\sum_j p(x_i,y_j)$ and similar for the other case)

etc

CB
We can of course shorten this by observing that by definition:

$\displaystyle E(X)+E(Y)=\sum_i \sum_j x_i p(x_i,y_j) + \sum_j \sum_i y_j p(x_i,y_j)$

CB

5. Originally Posted by whatsanihar
Yeah, that's what I did for the expected value and got that. But what should I do for Var(X+Y)? Var(X) = E[(X - u)^2], where u = E(X), but what is Var(X+Y) equal to?
$\displaystyle Var(X+Y)=E([(X+Y)-(\overline{X}+\overline{Y})]^2)$ $\displaystyle =E([(X-\overline{X})+(Y-\overline{Y})]^2)$ $\displaystyle =E((X-\overline{X})^2 + 2(X-\overline{X})(Y-\overline{Y})+(Y-\overline{Y})^2)$

Now use what has already been proven (that the expectation of a sum is the sum of the expectations) to get:

$\displaystyle Var(X+Y)=Var(X)+Var(Y) + 2\,Cov(X,Y)$

CB
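
The same expansion handles the general case in the original problem (a sketch: the constants come out of the centred terms, and the cross term picks up the factor 2ab):

$\displaystyle Var(aX+bY)=E([a(X-\overline{X})+b(Y-\overline{Y})]^2)$ $\displaystyle =a^2 Var(X)+b^2 Var(Y)+2ab\,Cov(X,Y)$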

6. But isn't that saying that Var(X+Y) = Var(X) + Var(Y)?

When Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X, Y)?

7. Originally Posted by whatsanihar
But isn't that saying that Var(X+Y) = Var(X) + Var(Y)?

When Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X, Y)?
Sorry you got to see a work in progress; go back and look at the final form.

CB

8. Thanks, thanks. It makes sense now.
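
As a quick numerical sanity check, both identities can be verified directly on a small discrete joint distribution (the values, probabilities, and constants a, b below are arbitrary illustrations, not from the thread):

```python
# Joint pmf p(x, y) of two discrete random variables X and Y.
# The support points and probabilities are arbitrary; they sum to 1.
p = {(1.0, -2.0): 0.10, (1.0, 0.0): 0.25, (1.0, 3.0): 0.15,
     (4.0, -2.0): 0.20, (4.0, 0.0): 0.05, (4.0, 3.0): 0.25}

def E(f):
    """E[f(X, Y)], computed directly from the joint pmf."""
    return sum(f(x, y) * q for (x, y), q in p.items())

a, b = 2.0, -3.0  # arbitrary real constants

EX = E(lambda x, y: x)
EY = E(lambda x, y: y)
var_x = E(lambda x, y: (x - EX) ** 2)
var_y = E(lambda x, y: (y - EY) ** 2)
cov_xy = E(lambda x, y: (x - EX) * (y - EY))

# Identity 1: E(aX + bY) = a E(X) + b E(Y)
lhs_mean = E(lambda x, y: a * x + b * y)
rhs_mean = a * EX + b * EY

# Identity 2: Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
lhs_var = E(lambda x, y: (a * x + b * y - lhs_mean) ** 2)
rhs_var = a ** 2 * var_x + b ** 2 * var_y + 2 * a * b * cov_xy

assert abs(lhs_mean - rhs_mean) < 1e-9
assert abs(lhs_var - rhs_var) < 1e-9
print("both identities check out numerically")
```

Because the expectation here is just a finite sum, the agreement is exact up to floating-point rounding, which is why a small tolerance is used in the comparisons.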
