- Sep 16th 2010, 09:04 AM, cuteylion: Expected Value
Hi,

Can anyone please help with the question?

Suppose E(U|X) = X^2. Suppose also that E(X) = 0, Var(X) = 1, and E(X^3) = 0.

What is E(UX)?

Thanks a lot for helping!
- Sep 16th 2010, 09:48 AM, Moo
Hello,

Use the tower property: E(UX) = E(E(UX|X)). Given X, the factor X can be pulled out of the conditional expectation, so E(UX|X) = X E(U|X) = X * X^2 = X^3. Hence E(UX) = E(X^3) = 0. (Note that E(X) = 0 and Var(X) = 1 are not even needed; only E(X^3) = 0 is used.)

:)
- Sep 16th 2010, 11:17 PM, aman_cc
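The answer E(UX) = E(X^3) = 0 can be sanity-checked by simulation. A minimal sketch, assuming X ~ N(0,1) (which satisfies all three moment conditions) and one concrete choice of U, namely U = X^2 + Z with Z an independent standard normal so that E(U|X) = X^2 (this particular U is an illustration, not part of the original question):

```python
# Monte Carlo sanity check of E(UX) = E(X^3) = 0.
# Assumed example: X ~ N(0,1), U = X^2 + Z with Z ~ N(0,1)
# independent of X, so that E(U | X) = X^2.
import random

random.seed(0)
n = 200_000
total = 0.0
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    z = random.gauss(0.0, 1.0)   # independent noise
    u = x * x + z                # guarantees E(U | X) = X^2
    total += u * x

mc_estimate = total / n          # should be close to E(X^3) = 0
print(mc_estimate)
```

With 200,000 samples the standard error is about 4/sqrt(200000) ≈ 0.009, so the estimate should land within a few hundredths of zero.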
@ Moo - Thanks for your response.

I do understand your solution.

But I am not comfortable with all the theorems, or rather the properties, of expectation in the case of joint distributions. I basically get confused and at times rely on intuition.

Would it be possible for you to recommend:

1. A source/book I can read to understand this better

2. An explanation of, for example, why

E(Y) = E(E(Y|X))

And then how you got

E[E[UX|X]] = E[X E[U|X]]. I understand it intuitively but not rigorously.
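Both identities can be checked mechanically on a small discrete example. A sketch with a made-up joint pmf (the numbers are for illustration only, not from the thread):

```python
# Verify E(Y) = E(E(Y|X)) (tower property) and
# E(XY) = E(X * E(Y|X)) ("pulling out what is known")
# on a toy joint pmf. The pmf values are made up for illustration.
p = {(0, 1): 0.2, (0, 2): 0.1, (1, 1): 0.3, (1, 3): 0.4}  # P(X=x, Y=y)

# Marginal of X
p_x = {}
for (x, y), prob in p.items():
    p_x[x] = p_x.get(x, 0.0) + prob

# Conditional expectation E(Y | X = x)
e_y_given_x = {
    x: sum(y * prob for (xx, y), prob in p.items() if xx == x) / p_x[x]
    for x in p_x
}

# Direct expectations vs. iterated (tower) expectations
e_y_direct = sum(y * prob for (x, y), prob in p.items())
e_y_tower = sum(e_y_given_x[x] * p_x[x] for x in p_x)

e_xy_direct = sum(x * y * prob for (x, y), prob in p.items())
e_xy_tower = sum(x * e_y_given_x[x] * p_x[x] for x in p_x)

print(e_y_direct, e_y_tower)    # both 1.9 (up to float rounding)
print(e_xy_direct, e_xy_tower)  # both 1.5 (up to float rounding)
```

Summing over x first, weighting each conditional expectation by P(X = x), reproduces the plain expectation exactly; the same bookkeeping proves the identity in general in the discrete case.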

Thanks
- Sep 17th 2010, 12:50 AM, matheagle
E(Y) = E(E(Y|X))

is easy to prove: just write out the conditional expectation and integrate/sum a second time.
- Sep 17th 2010, 01:08 AM, aman_cc
Thanks.

But where I'm weak is in my understanding of joint distributions. I tried referring to some online resources but couldn't get a very clear idea. And since I'm not attending any university, it's a little hard for me to find out where to read up on it. Thanks
- Sep 17th 2010, 02:06 AM, cuteylion
Thanks for your help!

- Sep 17th 2010, 08:26 AM, matheagle
In the continuous setting,

E(Y|X = x) = ∫ y f(y|x) dy.

Now take the expectation wrt x:

E(E(Y|X)) = ∫ E(Y|X = x) f_X(x) dx = ∫∫ y f(y|x) f_X(x) dy dx = ∫∫ y f(x, y) dy dx = E(Y),

and wave the Fubini wand to justify swapping the order of integration.
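The iterated-integral argument, Fubini swap included, can be checked numerically with midpoint Riemann sums. A sketch on an assumed example (not from the thread): X ~ Uniform(0,1) and Y|X = x ~ Uniform(0, x), so the joint density is f(x, y) = 1/x on the triangle 0 < y < x < 1, E(Y|X = x) = x/2, and E(Y) = 1/4:

```python
# Numerically check E(Y) = E(E(Y|X)) and the Fubini swap with midpoint
# Riemann sums. Assumed example: X ~ Uniform(0,1), Y|X=x ~ Uniform(0,x),
# so f(x,y) = 1/x on 0 < y < x < 1 and the true value is E(Y) = 1/4.
n = 400  # subdivisions per axis

def midpoints(a, b, m):
    """Midpoints of m equal subintervals of (a, b), paired with the step."""
    step = (b - a) / m
    return [(a + (j + 0.5) * step, step) for j in range(m)]

# Inner integral over y first: E(Y|X=x) = int_0^x y * (1/x) dy = x/2,
# then the outer expectation over X (f_X(x) = 1 on (0,1)).
e_y_iterated = sum(
    sum(y * (1.0 / x) * dy for y, dy in midpoints(0.0, x, n)) * dx
    for x, dx in midpoints(0.0, 1.0, n)
)

# Fubini: swap the order and integrate over x first:
# E(Y) = int_0^1 y * (int_y^1 (1/x) dx) dy = int_0^1 y * ln(1/y) dy = 1/4.
e_y_swapped = sum(
    y * sum((1.0 / x) * dx for x, dx in midpoints(y, 1.0, n)) * dy
    for y, dy in midpoints(0.0, 1.0, n)
)

print(e_y_iterated, e_y_swapped)  # both close to 0.25
```

Both orders of integration converge to the same value, which is the content of the Fubini step in the proof above.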

- Sep 17th 2010, 12:31 PM, Moo
aman_cc,

I have a very formal proof of the fact that E(Y) = E(E(Y|X)), but it's kind of long, and very theoretical. You also need measure theory knowledge. Well, I don't think it's the relevant one here...

The general idea is to say that in a probability space (Ω, F, P), given a sub-sigma-algebra G of F and an integrable random variable X, there exists an (a.s.) unique random variable E[X|G] such that ∫_A E[X|G] dP = ∫_A X dP for any A in G.

In particular, for A = Ω, we get the desired equality E(E[X|G]) = E(X).

Note that when we write E[U|X], it's in fact E[U|σ(X)], where σ(X) is the sigma-algebra generated by X. So it's σ(X) in place of G in the above paragraph.

Also note that with the Aussie singer's method (matheagle's), we have to assume that these random variables have a pdf.

For the second point, the proof doesn't look that difficult, it's just long and painful... I don't know how to explain it in words...
- Sep 19th 2010, 10:36 PM, aman_cc
Thanks. I did read up about this proof in particular.

If there is a good book at an intermediate level to read up on probability theory, expectations, (joint) distributions, etc., please do refer it to me. I like books which start with clear definitions of concepts/axioms and then proceed to prove such theorems from there. You can consider me a novice at measure theory.
- Sep 20th 2010, 12:52 AM, Moo
Hi,

I know three books in English that look good. But I don't know if they're the best. My library isn't specialized in English books :D

1. David Williams, Probability with Martingales, Cambridge Mathematical Textbooks (ISBN 9780521406055)

2. Patrick Billingsley, Probability and Measure, 3rd Edition (ISBN 9780471007104)

3. Marek Capinski and Ekkehard Kopp, Measure, Integral and Probability

Well, they're expensive, but I hope a library near you has some of them!