# Covariance of jointly distributed variables

• Jan 18th 2010, 08:46 PM
maths3086
Covariance of jointly distributed variables

Show that Cov (X,Y) = Cov (X, E(Y|X))

Using the property: E(U) = E[ E(U|V) ]

I know that Cov (X,Y) = E (XY) - E(X)E(Y) as the first step but how do I use the property given in the question? Thanks.
• Jan 18th 2010, 10:25 PM
ANDS!
Use the fact that averaging E(Y|X) over X recovers the marginal expectation of Y — that is exactly the property quoted in the question. This differs from the usual conditional on a specific value, in that no explicit value of X is given here: E(Y|X) is a random variable, a function of X. But that's OK, you don't need one. You just have to do some algebraic simplification. Have you tried actually expanding the right side of the equation using the covariance formula you already know? E(Y|X) is just an expression, so plug that expression in and see what you get.
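Before doing the algebra, a quick numerical sanity check that the identity even holds (my own sketch, not part of the thread): pick a toy model where E(Y|X) is known in closed form, e.g. Y = X² + ε with ε independent noise, so E(Y|X) = X², and compare the two sample covariances.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.normal(size=n)
eps = rng.normal(size=n)   # noise independent of x
y = x**2 + eps             # so E(Y|X) = X^2 exactly
e_y_given_x = x**2

# Sample covariances of (X, Y) and (X, E(Y|X))
cov_xy = np.cov(x, y)[0, 1]
cov_x_eyx = np.cov(x, e_y_given_x)[0, 1]

# The two should agree up to sampling noise, since they differ
# only by the sample estimate of Cov(X, eps), which is 0 in theory.
print(cov_xy, cov_x_eyx)
```

The two printed values agree to a few decimal places; any gap is just the Monte Carlo estimate of Cov(X, ε), which shrinks like 1/√n.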
• Jan 18th 2010, 11:58 PM
maths3086
Thank you for your reply ANDS. I started by expanding the RHS and then trying to show that it was the same as the LHS.

So RHS:

$\operatorname{Cov}\big(X, E(Y|X)\big) = E\big(X \cdot E[Y|X]\big) - E(X) \cdot E\big(E[Y|X]\big)$

LHS:

$\operatorname{Cov}(X, Y) = E(XY) - E(X) \cdot E(Y)$

From the property $E(Y) = E\big(E(Y|X)\big)$, so

$= E(XY) - E(X) \cdot E\big(E(Y|X)\big)$

$= \underline{\mathbf{E\big(X \cdot E[Y|X]\big)}} - E(X) \cdot E\big(E(Y|X)\big)$

I am not sure how I got the bold underlined part above. Is this a rule?

Can anyone explain the red boxed section of this image?

Thanks.
• Jan 20th 2010, 02:55 AM
Moo
Hello,

For the bold underlined step, see the formal definition here: Conditional expectation - Wikipedia, the free encyclopedia,
which says exactly what you've written.

As for the boxed part: $X$ is obviously $\sigma(X)$-measurable, so it can be taken out of the conditional expectation. This implies the stated identity: $E[XY|X] = X\,E[Y|X]$
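Putting the pieces of the thread together, the full derivation is short (a sketch using only the tower property and the "taking out what is known" rule discussed above):

```latex
\begin{aligned}
\operatorname{Cov}(X, Y)
  &= E(XY) - E(X)\,E(Y) \\
  &= E\big(E(XY \mid X)\big) - E(X)\,E\big(E(Y \mid X)\big)
     && \text{tower property, applied to } XY \text{ and to } Y \\
  &= E\big(X\,E(Y \mid X)\big) - E(X)\,E\big(E(Y \mid X)\big)
     && E(XY \mid X) = X\,E(Y \mid X) \\
  &= \operatorname{Cov}\big(X, E(Y \mid X)\big).
\end{aligned}
```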