
Conditional expectation

  1. #1 Moo

    Conditional expectation

    Hi!

    Okay, this problem has been bugging me... I guess I still have some trouble with conditioning.

    Let X and Y be two independent rv's, both uniformly distributed over [0,1].
    Define Z=\max\{0,Y-X\}.

    Find the conditional expectation \mathbb{E}[Z|X]. Then find the conditional distribution of Z given X=x.

    Hmm... I was thinking about splitting the expectation and keeping the part where Y>X (since it's 0 elsewhere), but I'm drawing a blank there.



    Also, is it more logical to find the conditional distribution first, before the conditional expectation?
    And would someone be kind enough to give me some general guidelines for dealing with this kind of problem?

    Thanks for any help!

  2. #2 Laurent (MHF Contributor)
    Quote Originally Posted by Moo View Post
    I was thinking about splitting the expectation and keeping the part where Y>X (since it's 0 elsewhere), but I'm drawing a blank there.
    That's the correct idea. Indeed, we have Z=(Y-X){\bf 1}_{(Y>X)}, hence E[Z|X]= E[(Y-X){\bf 1}_{(Y>X)}|X]. To compute this expectation, you treat X as a constant and integrate with respect to the distribution of Y given X, which is uniform on [0,1] (because X and Y are independent). Thus you can write E[Z|X]=\int_0^1 (y-X){\bf 1}_{(y>X)}\,dy = \int_X^1 (y-X)\, dy, etc. Perhaps you would prefer writing E[Z|X=x] and using little x afterward; sometimes this avoids confusion.
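
    Working that last integral out gives \int_X^1 (y-X)\,dy = \frac{(1-X)^2}{2}, i.e. E[Z|X=x]=\frac{(1-x)^2}{2}. Here is a quick Monte Carlo sanity check of that value (a sketch in Python, assuming numpy is available; the fixed value x=0.3 is arbitrary):

        import numpy as np

        rng = np.random.default_rng(0)

        x = 0.3                           # condition on X = x
        y = rng.uniform(0.0, 1.0, 10**6)  # Y ~ U[0,1], independent of X
        z = np.maximum(0.0, y - x)        # Z = max(0, Y - X) given X = x

        print(z.mean())        # Monte Carlo estimate of E[Z | X = x]
        print((1 - x)**2 / 2)  # closed form: (1 - 0.3)^2 / 2 = 0.245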


    Quote Originally Posted by Moo View Post
    Also, is it more logical to find the conditional distribution first, before the conditional expectation?
    It depends, just as with ordinary expectations. Sometimes it is much shorter to find the conditional expectation directly. Sometimes (as in cases with densities), however, it is simpler to give the conditional distribution first, because the conditional expectation then requires the conditional distribution plus an extra integration.

    By the way, it is possible to deduce the definition of conditional expectation from that of the conditional distribution (but the existence of conditional distributions is a delicate matter), and it is also possible to define conditional expectation on its own (for which there are several simpler existence proofs).

    If (X,Z) has a density f_{(X,Z)} and you need the law of Z given X, you know the formula for that: f_{Z|X=x}(z)=\frac{f_{(X,Z)}(x,z)}{f_X(x)}.
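
    (A quick side illustration of that formula, with a made-up pair rather than the present one, where (X,Z) has no joint density because Z carries an atom at 0: if (X,Z) is uniform on the triangle \{0<z<x<1\}, then f_{(X,Z)}(x,z)=2 and f_X(x)=\int_0^x 2\,dz=2x, hence f_{Z|X=x}(z)=\frac{2}{2x}=\frac{1}{x} on (0,x); that is, Z given X=x is uniform on (0,x).)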

    If X is discrete, you may condition on \{X=x\} directly, hence no specific problem.

    In other cases, you can proceed from the definition: if, for every measurable g:\mathbb{R}\to\mathbb{R}_+, E[g(Z)]=\cdots = \int E[g(Y_x)]\, d\mu_X(x) (for some family of r.v.'s Y_x, x\in \mathbb{R}), then the conditional law of Z given X=x is the law of Y_x. This may get messy.

    In the present case, the easiest way is definitely to compute the conditional distribution function. For all 0\leq t\leq 1, P(Z<t|X)=P(Z=0|X)+P(0<Y-X<t|X)=\cdots. Compute the probabilities involving (X,Y) by treating X as a constant (because of the independence).
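
    Carrying the computation through (with X treated as the constant x): P(Z=0|X=x)=P(Y\leq x)=x and P(0<Y-x\leq t)=\min(x+t,1)-x, so P(Z\leq t|X=x)=\min(x+t,1) for 0\leq t\leq 1: an atom of mass x at 0 plus a uniform density 1 on (0,1-x). A short empirical check of this distribution function (again a Python sketch assuming numpy; x=0.3 and the grid of t values are arbitrary):

        import numpy as np

        rng = np.random.default_rng(1)

        x = 0.3
        y = rng.uniform(0.0, 1.0, 10**6)
        z = np.maximum(0.0, y - x)        # samples of Z given X = x

        for t in [0.0, 0.2, 0.5, 0.8]:
            empirical = (z <= t).mean()   # empirical P(Z <= t | X = x)
            exact = min(x + t, 1.0)       # atom of mass x at 0 + uniform part
            print(t, empirical, exact)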

    In other situations, a conditional characteristic function could be used as well, or a conditional moment generating function, or whatever suits best: any tool for finding distributions, as long as the computations stay manageable.

    Finally, you can check your work by computing the conditional expectation from the conditional distribution.
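
    For instance, here the check reads E[Z|X=x]=0\cdot x+\int_0^{1-x} z\cdot 1\,dz=\frac{(1-x)^2}{2}, in agreement with the direct computation above.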

  3. #3 Moo
    Okay, thanks for your explanations!
    Yes, we first work with X=x, then turn it back into X.

    There's some material we haven't studied yet (and we've already finished the conditioning part of the course).

    This is what we did today, basically:
    1/ There's a property saying that for any (well-defined) function \varphi, if X and Y are independent, then \mathbb{E}[\varphi(X,Y)\mid X=x]=\mathbb{E}[\varphi(x,Y)],

    which gives what you did.


    2/ For the conditional distribution: since X and Y are independent, a property says that the conditional distribution of \varphi(X,Y) given X=x is the distribution of \varphi(x,Y).
    We then compute the distribution of this new rv.


    3/ If we know the conditional distribution of, say, Y given X=x, we can compute \mathbb{E}[f(Y) \mid X=x] for any function f that is bounded or positive (that's what we were told...).
    So it's not immediate to get the conditional expectation from it, since the identity function is neither bounded nor positive in general.
    So we have to fall back on 1/ in this case.


    It looks a bit complicated, but strangely, I understand it better with lots of properties than with intuitive stuff... Anyway, I mostly understand what you wrote. Thanks again!

  4. #4 Laurent (MHF Contributor)
    Quote Originally Posted by Moo View Post
    3/ If we know the conditional distribution of, say, Y given X=x, we can compute \mathbb{E}[f(Y) \mid X=x] for any function f that is bounded or positive (that's what we were told...).
    So it's not immediate to get the conditional expectation from it, since the identity function is neither bounded nor positive in general.
    So we have to fall back on 1/ in this case.
    As you might guess, I don't quite agree with that... By definition, \mathbb{E}[f(Y) \mid X=x] is the expectation of f with respect to some probability measure, namely the conditional distribution of Y given X=x. Therefore the same restrictions apply as for any expectation: as soon as E[|f(Y)|\, |X=x]<\infty or f(Y)\geq 0, we may define E[f(Y)\mid X=x] (by separating the positive and negative parts of f in the first case).

    Anyway, in the present situation 0\leq Z\leq 1, so both restrictions are fulfilled!

    By the way, when you define E[Y|X=x] directly, you also require something of Y, such as integrability or positivity.
    Quote Originally Posted by Moo View Post
    It looks a bit complicated, but strangely, I understand it better with lots of properties than with intuitive stuff...
    Not so strange; I'm just like you. However, I thought you had already covered the "formal properties" part...

    Ciao,
    Laurent.
