# Thread: Covariance of jointly distributed variables

1. ## Covariance of jointly distributed variables

Show that Cov(X, Y) = Cov(X, E(Y|X)),

using the property E(U) = E[E(U|V)].

I know that Cov(X, Y) = E(XY) - E(X)E(Y) as the first step, but how do I use the property given in the question? Thanks.

2. Note that E(Y|X) is itself a random variable, a function of X: no explicit value of X is being conditioned on here. That's OK, you don't need one; averaging E(Y|X) over X recovers the marginal behaviour of Y, which is exactly what the given property says. You just have to do some algebraic simplification. Have you tried actually starting from the right side of the equation using the formula you already know? You know that E(Y|X) is just an expression, so plug that expression in and see what you get.
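As a sanity check before doing the algebra, a quick simulation shows the identity numerically. This is only a sketch under an assumed toy model, Y = 2X + noise, chosen so that E(Y|X) = 2X is known in closed form; the two sample covariances should then agree up to Monte Carlo error.

```python
import random

random.seed(0)
n = 200_000

# Assumed toy model: X ~ Uniform(0, 1), Y = 2X + eps with independent
# Gaussian noise, so E(Y | X) = 2X exactly.
xs = [random.random() for _ in range(n)]
ys = [2 * x + random.gauss(0, 1) for x in xs]
cond = [2 * x for x in xs]  # E(Y|X) evaluated at each sample

def cov(a, b):
    """Sample covariance via E(AB) - E(A)E(B)."""
    m = len(a)
    mean_a = sum(a) / m
    mean_b = sum(b) / m
    return sum(x * y for x, y in zip(a, b)) / m - mean_a * mean_b

print(cov(xs, ys))    # close to 2 * Var(X) = 2/12
print(cov(xs, cond))  # nearly the same value: Cov(X, Y) = Cov(X, E(Y|X))
```

Here the noise term averages out of Cov(X, Y), which is the intuition behind the identity being proved.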

3. Thank you for your reply ANDS. I started by expanding the RHS and then trying to show that it was the same as the LHS.

So RHS:

Cov(X, E(Y|X)) = E(X·E[Y|X]) - E(X)·E(E[Y|X])

LHS:

Cov(X, Y) = E(XY) - E(X)·E(Y)

From the property E(Y) = E(E(Y|X)), this becomes

= E(XY) - E(X)·E(E(Y|X))

= E(X·E[Y|X]) - E(X)·E(E(Y|X))

I am not sure how to justify the step E(XY) = E(X·E[Y|X]) above. Is this a rule?

Can anyone also explain the boxed step in the attached image, i.e. why E[XY|X] = X·E[Y|X]?

Thanks.

4. Hello,

For that step, see the formal definition here: Conditional expectation - Wikipedia, the free encyclopedia,
which says exactly what you've written.

As for the boxed step, it's because X is obviously $\displaystyle \sigma(X)$-measurable, so it can be pulled outside a conditional expectation given X. This gives exactly what is stated: $\displaystyle E[XY|X]=XE[Y|X]$.
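Putting post 3's expansion together with this identity, the whole proof can be assembled in one chain (just the steps already shown, written out in order):

```latex
\begin{align*}
\operatorname{Cov}\bigl(X, E[Y|X]\bigr)
  &= E\bigl(X\,E[Y|X]\bigr) - E(X)\,E\bigl(E[Y|X]\bigr) \\
  &= E\bigl(E[XY|X]\bigr) - E(X)\,E\bigl(E[Y|X]\bigr)
      && \text{since } E[XY|X] = X\,E[Y|X] \\
  &= E(XY) - E(X)\,E(Y)
      && \text{tower property } E(U) = E[E(U|V)] \\
  &= \operatorname{Cov}(X, Y).
\end{align*}
```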