# Math Help - Expected Value

1. ## Expected Value

Hi,

Suppose E(U|X) = X^2. Suppose also that E(X) = 0, Var(X) = 1, and E(X^3) = 0.

What is E(UX)?

Thanks a lot for helping!

2. Hello,

$E[UX]=E[E[UX|X]]=E[XE[U|X]]=E[X^3]=0$
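
For a quick numerical sanity check, here is a minimal simulation under an assumed joint model (an illustration, not from the thread): X ~ N(0,1) and U = X^2 + Z with Z ~ N(0,1) independent of X. This model satisfies E(U|X) = X^2, E(X) = 0, Var(X) = 1 and E(X^3) = 0, so E(UX) should come out near 0.

```python
import numpy as np

# Assumed model (an illustration, not from the thread):
# X ~ N(0,1), U = X^2 + Z with Z ~ N(0,1) independent of X,
# so E(U|X) = X^2, E(X) = 0, Var(X) = 1 and E(X^3) = 0 all hold.
rng = np.random.default_rng(0)
n = 10**6
x = rng.standard_normal(n)
z = rng.standard_normal(n)
u = x**2 + z

print(np.mean(u * x))  # Monte Carlo estimate of E(UX), should be near 0
print(np.mean(x**3))   # Monte Carlo estimate of E(X^3), also near 0
```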

3. @ Moo - Thanks for your response.

I do understand your solution.

But I am not comfortable with all the theorems, or rather properties, of expectation (in the case of joint distributions). I basically get confused and at times rely on intuition.

Would it be possible for you to recommend:
1. A source/book I can read to understand this better
2. For example, why is
E(Y) = E(E(Y|X))
and then how you got
E[E[UX|X]]=E[XE[U|X]] - I understand it intuitively but not rigorously.

Thanks

4. E(Y) = E(E(Y|X))
is easy to prove: just write out the conditional expectation and integrate/sum a second time.
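
For instance, a minimal sketch in the discrete case (assuming X and Y take countably many values):

$E(E(Y|X))=\sum_x E(Y|X=x)P(X=x)=\sum_x\sum_y y\,P(Y=y|X=x)P(X=x)$

$=\sum_y y\sum_x P(X=x,Y=y)=\sum_y y\,P(Y=y)=E(Y)$

The continuous version of the same computation is worked out below.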

5. Thanks.

But where I'm weak is in my understanding of joint distributions. I tried consulting some online resources but couldn't get a very clear idea. And since I'm not attending any university, it's a little hard for me to find out where to read up on it. Thanks

6. Thanks for your help!

7. In the continuous setting....

$E(Y|X)=\int yf_{Y|X}(y)dy$

$=\int y{f_{X,Y}(x,y)\over f_X(x)}dy$

now take the expectation wrt x...

$E(E(Y|X))=\int E(Y|X)f_X(x)dx$

and wave the Fubini wand...

$E(E(Y|X))=\int \int y{f_{X,Y}(x,y)\over f_X(x)}f_X(x)\,dy\,dx$

$=\int \int yf_{X,Y}(x,y)dydx=E(Y)$

8. aman_cc,

I have a very formal proof of the fact that $E[E[X|Y]]=E[X]$, but it's kind of long and very theoretical. You also need measure theory knowledge, so I don't think it's the most relevant one here...

The general idea is to say that in a probability space $(\Omega,\mathcal A,\mathbb P)$, given a sigma-algebra $\mathcal B$ and a random variable $X\in L^1(\Omega,\mathcal A,\mathbb P)$, there exists a random variable $Z=E[X|\mathcal B]$, unique up to almost sure equality, such that for any $B\in \mathcal B$, $\int_B Z \,d\mathbb P=\int_B X\,d\mathbb P$.
In particular, for $B=\Omega$, we get the desired equality.
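
Written out, with $B=\Omega$:

$E[E[X|\mathcal B]]=\int_\Omega E[X|\mathcal B]\,d\mathbb P=\int_\Omega X\,d\mathbb P=E[X]$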

Note that when we write $E[X|Y]$, it's in fact $E[X|\sigma(Y)]$, where $\sigma(Y)$ is the sigma-algebra generated by $Y$. So it plays the role of $\mathcal B$ in the above paragraph.

Also note that with matheagle's (the Aussie singer's) method, we have to assume that these random variables have a pdf.

For the second point, the proof doesn't look that difficult, it's just long and painful... I don't know how to explain it with words...
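
That said, here is a rough outline of the standard argument (a sketch only, assuming the relevant integrability): the step $E[UX|X]=X\,E[U|X]$ is an instance of "taking out what is known". For a $\sigma(X)$-measurable $Z$, one checks $\int_B ZU\,d\mathbb P=\int_B Z\,E[U|X]\,d\mathbb P$ for every $B\in\sigma(X)$: first for $Z=\mathbf 1_A$ with $A\in\sigma(X)$, where it reduces to the defining property applied to $A\cap B$; then for simple $Z$ by linearity; then for general $Z$ by monotone convergence. Taking $Z=X$ gives exactly the step used above.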

9. If there is a good book at an intermediate level on probability theory, expectations, (joint) distributions, etc., please do recommend it to me. I like books which start with clear definitions of concepts/axioms and then proceed to prove such theorems from there. You can consider me a novice at measure theory.

10. Hi,

I know three books in English that look good, but I don't know if they're the best; my library isn't specialized in English books.

Probability with Martingales, David Williams (Cambridge Mathematical Textbooks, ISBN 9780521406055) - Amazon.com
Probability and Measure, 3rd Edition, Patrick Billingsley (ISBN 9780471007104) - Amazon.com
Measure, Integral and Probability, Marek Capinski, Peter E. Kopp, Ekkehard Kopp - Amazon.fr (Books in English)

Well, they're expensive, but I hope the library in your university has some of them!