- Sep 16th 2010, 08:04 AM, cuteylion: Expected Value
Hi,

Can anyone please help with this question?

Suppose E(U|X) = X^2. Suppose that E(X) = 0, Var(X) = 1 and E(X^3) = 0

What is E(UX)?

Thanks a lot for helping!

- Sep 16th 2010, 08:48 AM, Moo
Hello,

$\displaystyle E[UX]=E[E[UX|X]]=E[XE[U|X]]=E[X^3]=0$
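As a quick sanity check, this chain of equalities can be simulated. Below, X is taken to be standard normal (which satisfies E(X) = 0, Var(X) = 1, E(X^3) = 0) and U = X^2 + noise with independent mean-zero noise, so that E(U|X) = X^2; both modelling choices are illustrative assumptions, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative model: X standard normal, so E(X)=0, Var(X)=1, E(X^3)=0
x = rng.standard_normal(n)
# U = X^2 + independent mean-zero noise, so E(U|X) = X^2
u = x**2 + rng.standard_normal(n)

# E(UX) should agree with E(X^3), and both should be near 0
print(np.mean(u * x), np.mean(x**3))
```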

:)

- Sep 16th 2010, 10:17 PM, aman_cc
@ Moo - Thanks for your response.

I do understand your solution.

But I am not comfortable with all the theorems, or rather the properties, of expectation (in the case of joint distributions). I basically get confused and at times rely on intuition.

Would it be possible for you to recommend:

1. A source/book I can read to understand this better

2. An explanation of why, e.g.,

E(Y) = E(E(Y|X))

And then how you got

E[E[UX|X]]=E[XE[U|X]] - I understand it intuitively but not rigorously.
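Both identities in question can be checked exactly on a small discrete example. The joint table below is made up purely for illustration:

```python
from fractions import Fraction as F

# Hypothetical joint pmf P(X=x, Y=y); values chosen only for illustration
pmf = {(0, 1): F(1, 4), (0, 3): F(1, 4),
       (1, 1): F(1, 8), (1, 3): F(3, 8)}

# Marginal pmf of X
px = {}
for (x, y), p in pmf.items():
    px[x] = px.get(x, F(0)) + p

# E(Y | X = x) = sum_y y P(X=x, Y=y) / P(X=x)
def cond_exp_y(x):
    return sum(y * p for (xx, y), p in pmf.items() if xx == x) / px[x]

# Tower property: E(E(Y|X)) = sum_x E(Y|X=x) P(X=x) should equal E(Y)
lhs = sum(cond_exp_y(x) * p for x, p in px.items())
rhs = sum(y * p for (x, y), p in pmf.items())
print(lhs, rhs)  # 9/4 9/4

# "Pulling out what is known": E(XY) = E(X E(Y|X))
exy = sum(x * y * p for (x, y), p in pmf.items())
ex_condy = sum(x * cond_exp_y(x) * p for x, p in px.items())
print(exy, ex_condy)  # 5/4 5/4
```

Because the arithmetic is done with exact fractions, the two sides match identically rather than just approximately.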

Thanks

- Sep 16th 2010, 11:50 PM, matheagle
E(Y) = E(E(Y|X))

is easy to prove: just write out the conditional expectation, then integrate/sum a second time.

- Sep 17th 2010, 12:08 AM, aman_cc
Thanks.

But where I'm weak is in understanding joint distributions. I tried some online resources but couldn't get a very clear idea, and since I'm not attending any university it's a little hard for me to find out where to read up on it. Thanks

- Sep 17th 2010, 01:06 AM, cuteylion
Thanks for your help!

- Sep 17th 2010, 07:26 AM, matheagle
In the continuous setting...

$\displaystyle E(Y|X)=\int yf_{Y|X}(y|x)\,dy$

$\displaystyle =\int y{f_{X,Y}(x,y)\over f_X(x)}dy$

Now take the expectation with respect to $\displaystyle X$...

$\displaystyle E(E(Y|X))=\int E(Y|X)f_X(x)dx$

and wave the Fubini wand...

$\displaystyle E(E(Y|X))=\int \int y{f_{X,Y}(x,y)\over f_X(x)}dyf_X(x)dx$

$\displaystyle =\int \int yf_{X,Y}(x,y)\,dy\,dx=E(Y)$

- Sep 17th 2010, 11:31 AM, Moo
aman_cc,

I have a very formal proof of the fact that $\displaystyle E[E[X|Y]]=E[X]$, but it's kind of long, very theoretical, and requires some measure theory, so I don't think it's the most relevant one here...

The general idea is that in a probability space $\displaystyle (\Omega,\mathcal A,\mathbb P)$, given a sigma-algebra $\displaystyle \mathcal B$ and a random variable $\displaystyle X\in L^1(\Omega,\mathcal A,\mathbb P)$, there exists an (almost surely) unique random variable $\displaystyle Z=E[X|\mathcal B]$ such that for any $\displaystyle B\in \mathcal B$, $\displaystyle \int_B Z \,d\mathbb P=\int_B X\,d\mathbb P$.

In particular, for $\displaystyle B=\Omega$, we get the desired equality.
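As a concrete continuous illustration of this equality (in the density setting matheagle used above), take an assumed joint density f(x, y) = x + y on the unit square, chosen purely for illustration, and compare E(E(Y|X)) with E(Y) numerically:

```python
from scipy.integrate import quad, dblquad

# Illustrative joint density on [0,1]^2 (an assumption, not from the thread)
def f(x, y):
    return x + y

# Marginal: f_X(x) = int_0^1 f(x, y) dy
def f_x(x):
    return quad(lambda y: f(x, y), 0, 1)[0]

# E(Y | X = x) = int_0^1 y f(x, y) dy / f_X(x)
def cond_exp(x):
    return quad(lambda y: y * f(x, y), 0, 1)[0] / f_x(x)

# Outer expectation: E(E(Y|X)) = int_0^1 E(Y|X=x) f_X(x) dx
lhs = quad(lambda x: cond_exp(x) * f_x(x), 0, 1)[0]
# E(Y) directly from the joint density (dblquad integrates over y first)
rhs = dblquad(lambda y, x: y * f(x, y), 0, 1, 0, 1)[0]
print(lhs, rhs)  # both 7/12 = 0.58333...
```

Notice that the division by f_X(x) inside cond_exp cancels against the multiplication by f_X(x) in the outer integral, which is exactly matheagle's Fubini step.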

Note that when we write $\displaystyle E[X|Y]$, it is in fact $\displaystyle E[X|\sigma(Y)]$, where $\displaystyle \sigma(Y)$ is the sigma-algebra generated by $\displaystyle Y$; it plays the role of $\displaystyle \mathcal B$ in the paragraph above.

Also note that with the Aussie singer's method (matheagle's), we have to assume that these random variables have a pdf.

For the second point, the proof doesn't look that difficult, it's just long and painful... I don't know how to explain it in words...

- Sep 19th 2010, 09:36 PM, aman_cc
Thanks. I did read up about this proof in particular.

If there is a good book at an intermediate level on probability theory, expectation, (joint) distributions, etc., please do refer it to me. I like books that start with clear definitions of concepts/axioms and then prove such theorems from there. You can consider me a novice at measure theory.

- Sep 19th 2010, 11:52 PM, Moo
Hi,

I know three books in English that look good. But I don't know if they're the best. My library isn't specialized in English books :D

1. David Williams, Probability with Martingales, Cambridge University Press (ISBN 9780521406055)

2. Patrick Billingsley, Probability and Measure, 3rd Edition, Wiley (ISBN 9780471007104)

3. Marek Capinski and Peter E. Kopp, Measure, Integral and Probability, Springer

Well, they're expensive, but I hope a library near you has some of them!