# Thread: Variation of a RV conditional on another - help pls

1. ## Variation of a RV conditional on another - help pls

Hello. I have the following problem to solve: prove that

Var(X) = E(Var(X|Y)) + Var(E(X|Y))

where X|Y denotes X conditional on Y. Sorry, I do not know how to write on this LaTeX thing

2. Hello,
Originally Posted by dely84
Hello. I have the following problem to solve: prove that

Var(X) = E(Var(X|Y)) + Var(E(X|Y))

where X|Y denotes X conditional on Y. Sorry, I do not know how to write on this LaTeX thing
Sorry, I won't write it in LaTeX; I hope this will be enough.

The mean of a variable is not a variable anymore. It is a constant.
And so is the variance of a variable.

Now there are 2 properties of the mean and the variance. If a is a constant, then :
E{a}=a and Var{a}=0.

Hence E{Var(X|Y)}+Var{E(X|Y)}=Var(X|Y)+0=Var(X|Y)

So... is there some missing information or a mistake somewhere? :s

3. Originally Posted by Moo
Hello,

Sorry, I won't write it in LaTeX; I hope this will be enough.

The mean of a variable is not a variable anymore. It is a constant.
True, but the conditional mean given a random variable is a random variable (there are probably a few more chapters before you get to that cool notion).

To prove the equality, one should make the definitions of the right-hand side explicit: since ${\rm Var}(Z)=E[Z^2]-E[Z]^2$,

$E[{\rm Var}(X|Y)]+{\rm Var}(E[X|Y])=E[E[X^2|Y]-E[X|Y]^2]+E[E[X|Y]^2]-E[E[X|Y]]^2.$

Now, you should notice that two terms simplify, and if you remember that $E[E[Z|Y]]=E[Z]$ for any integrable $Z$, you'll notice that you're done.
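Since the identity can feel abstract at first, here is a quick numerical sanity check in Python on a small discrete joint distribution (the probabilities below are made up purely for illustration): both sides of Var(X) = E[Var(X|Y)] + Var(E[X|Y]) come out equal.

```python
# Numerical check of the law of total variance,
# Var(X) = E[Var(X|Y)] + Var(E[X|Y]), on a small discrete
# joint distribution. The probabilities are arbitrary.

# (x, y) -> P(X = x, Y = y)
joint = {
    (0, 0): 0.1, (1, 0): 0.2, (2, 0): 0.1,
    (0, 1): 0.3, (1, 1): 0.1, (2, 1): 0.2,
}

def mean(dist):
    """Mean of a distribution given as {value: probability}."""
    return sum(v * p for v, p in dist.items())

def var(dist):
    """Variance of a distribution given as {value: probability}."""
    m = mean(dist)
    return sum(p * (v - m) ** 2 for v, p in dist.items())

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Conditional distributions of X given Y = y.
cond = {y: {} for y in py}
for (x, y), p in joint.items():
    cond[y][x] = p / py[y]

# Left-hand side: Var(X) from the marginal of X.
var_x = var(px)

# E[Var(X|Y)]: the conditional variances, averaged over P(Y = y).
e_var = sum(py[y] * var(cond[y]) for y in py)

# Var(E[X|Y]): the conditional mean is a function of Y,
# so its variance is taken over the distribution of Y.
cond_mean = {y: mean(cond[y]) for y in py}
m = sum(py[y] * cond_mean[y] for y in py)
var_e = sum(py[y] * (cond_mean[y] - m) ** 2 for y in py)

print(var_x)          # Var(X)
print(e_var + var_e)  # E[Var(X|Y)] + Var(E[X|Y]) -- agrees with Var(X)
```

This only verifies the identity for one example, of course; the algebraic proof sketched above is what establishes it in general.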

4. Originally Posted by Laurent
True, but the conditional mean given a random variable is a random variable (there are probably a few more chapters before you get to that cool notion).
Sorry for the "bump" or "up"...

We got into it - not in the class I expected, though - and it's just as you said: cool.
Along with some introductory Markov chains.