# Conditional expectation

• Oct 12th 2009, 11:45 AM
Moo
Conditional expectation
Hi !

Okay, this problem has been bugging me... I guess I have some problems with conditional things :(

Let X and Y be two independent rv's, both following a uniform distribution over [0,1].
Define $Z=\max\{0,Y-X\}$

Find the conditional expectation $\mathbb{E}[Z|X]$. Then find the conditional distribution of Z given X=x.

hem... I was thinking about splitting the expectation and keeping the part where Y>X (since it's 0 elsewhere).
But I'm drawing a blank there (Crying)

Also, is it more logical to first find the conditional distribution before the conditional expectation ?
And would someone be kind enough to give me the general guidelines when dealing with such a problem ?

Thanks for any help !
• Oct 12th 2009, 02:49 PM
Laurent
Quote:

Originally Posted by Moo
Hi !

Okay, this problem has been bugging me... I guess I have some problems with conditional things :(

Let X and Y be two independent rv's, both following a uniform distribution over [0,1].
Define $Z=\max\{0,Y-X\}$

Find the conditional expectation $\mathbb{E}[Z|X]$. Then find the conditional distribution of Z given X=x.

hem... I was thinking about splitting the expectation and keeping the part where Y>X (since it's 0 elsewhere).
But I'm drawing a blank there (Crying)

That is the correct idea. Like: we have $Z=(Y-X){\bf 1}_{(Y>X)}$, hence $E[Z|X]= E[(Y-X){\bf 1}_{(Y>X)}|X]$. And to compute this expectation, you consider $X$ as a constant, and integrate with respect to the distribution of $Y$ (given $X$), which is uniform on $[0,1]$ (because $X$ and $Y$ are independent). Thus, you can write $E[Z|X]=\int_0^1 (y-X){\bf 1}_{(y>X)}dy = \int_X^1 (y-X) dy$ etc. Perhaps you would prefer writing $E[Z|X=x]$ and using a lowercase $x$ afterward; sometimes this avoids confusion.
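Finishing the integral (the closed form below is my completion, not in the original post) gives $\int_x^1 (y-x)\,dy=\frac{(1-x)^2}{2}$. A quick Monte Carlo sketch to sanity-check it; the function and variable names here are mine:

```python
import random

def mc_cond_expectation(x, n=200_000):
    """Estimate E[Z | X = x] with Z = max(0, Y - X), Y ~ Uniform[0, 1].

    Since X and Y are independent, conditioning on X = x just fixes x
    and averages over fresh draws of Y.
    """
    total = 0.0
    for _ in range(n):
        y = random.random()          # Y ~ Uniform[0, 1]
        total += max(0.0, y - x)     # Z given X = x
    return total / n

# Claimed closed form: E[Z | X = x] = (1 - x)^2 / 2
for x in (0.2, 0.5, 0.8):
    est = mc_cond_expectation(x)
    exact = (1 - x) ** 2 / 2
    print(f"x={x}: estimate {est:.4f}  vs  exact {exact:.4f}")
```

The estimates should agree with the closed form up to Monte Carlo noise of order $1/\sqrt{n}$.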

Quote:

Also, is it more logical to first find the conditional distribution before the conditional expectation ?
It depends, just as with ordinary expectation. Sometimes it is much shorter to find the conditional expectation directly. Sometimes (as in cases with densities), however, it is simpler to give the conditional distribution first, because the conditional expectation requires the conditional distribution and then an extra integration.

By the way, it is possible to deduce the definition of conditional expectation from that of the conditional distribution (but the existence of conditional distributions is a delicate matter), and it is also possible to define conditional expectation on its own (there are several simpler existence proofs).

If $(X,Z)$ has a density $f_{(X,Z)}$ and you need the law of $Z$ given $X$, you know the formula for that ( $f_{Z|X=x}(z)=\frac{f_{(X,Z)}(x,z)}{f_X(x)}$).

If $X$ is discrete, you may condition by $\{X=x\}$, hence no specific problem.

In other cases, you can proceed from the definition: if, for all measurable $g:\mathbb{R}\to\mathbb{R}_+$, $E[g(Z)]=\cdots = \int E[g(Y_x)]\, d\mu_X(x)$ (for some family of r.v.'s $Y_x$, $x\in \mathbb{R}$), then the conditional law of $Z$ given $X=x$ is the law of $Y_x$. This may get messy.

In the present case, the easiest way is definitely to compute the conditional distribution function: for all $0\leq t\leq 1$, $P(Z\leq t\mid X)=E[{\bf 1}_{(Z\leq t)}\mid X]$. Compute the expectations depending on $(X,Y)$ by considering $X$ as a constant (because of the independence).
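Following that route here (my notation, not part of the original post): since $Z\leq t \iff Y\leq x+t$ when $t\geq 0$, one gets $P(Z\leq t\mid X=x)=\min(x+t,1)$ for $0\leq t\leq 1$. A small empirical check:

```python
import random

def empirical_cond_cdf(x, t, n=200_000):
    """Estimate P(Z <= t | X = x) for Z = max(0, Y - X), Y ~ Uniform[0, 1]."""
    hits = sum(max(0.0, random.random() - x) <= t for _ in range(n))
    return hits / n

# Claimed closed form: P(Z <= t | X = x) = min(x + t, 1) for 0 <= t <= 1
x = 0.3
for t in (0.1, 0.5, 0.9):
    print(f"t={t}: empirical {empirical_cond_cdf(x, t):.4f}  vs  {min(x + t, 1):.4f}")
```

Note the atom at $0$: $P(Z=0\mid X=x)=x$, so the conditional law is mixed (part discrete, part continuous), which is why working with the distribution function is convenient here.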

In other situations, a conditional characteristic function could be used as well. Or a conditional moment generating function, or whatever suits better... Any tool for finding distributions, as long as the computations are manageable.

Finally, you can check your work by computing the cond. expectation from the cond. distribution.
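For instance, in this problem the check works out as follows (my computation, using the tail formula for expectations of nonnegative random variables):

$$E[Z\mid X=x]=\int_0^\infty P(Z>t\mid X=x)\,dt=\int_0^{1-x}\bigl(1-(x+t)\bigr)\,dt=\frac{(1-x)^2}{2},$$

which agrees with the direct integration $\int_x^1 (y-x)\,dy$.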
• Oct 14th 2009, 11:46 AM
Moo
Okay, thanks for your explanations :)
Yes, we first work with X=x, then turn it back into X at the end.

There's some material we haven't studied yet (and we've already finished the conditional part of the course).

This is what we did today, basically :
1/ There's a property that says that for any (well-defined) function $\varphi$, if X and Y are independent, then $\mathbb{E}[\varphi(X,Y)\mid X=x]=\mathbb{E}[\varphi(x,Y)]$

which gives what you did.

2/ For the conditional distribution, since X and Y are independent, a property says that the conditional distribution of $\varphi(X,Y)$ given X=x is the distribution of $\varphi(x,Y)$
And then we compute the distribution of this new rv.

3/ If we know the conditional distribution of, let's say Y given X=x, we can compute $\mathbb{E}[f(Y) \mid X=x]$ for any function f that is bounded or positive (that's what we're told...)
So it's not immediate to get the conditional expectation from it, since the identity function is neither bounded nor positive in general.
So we have to refer to 1/ in this case.

Looks a bit complicated, but strangely, I understand it better when there are a lot of properties rather than learning intuitive stuff... But I mostly understand what you wrote :) Again thanks !
• Oct 14th 2009, 12:16 PM
Laurent
Quote:

Originally Posted by Moo
3/ If we know the conditional distribution of, let's say Y given X=x, we can compute $\mathbb{E}[f(Y) \mid X=x]$ for any function f that is bounded or positive (that's what we're told...)
So it's not immediate to get the conditional expectation from it, since the identity function is neither bounded nor positive in general.
So we have to refer to 1/ in this case.

As you guessed, I don't quite agree with that... By definition, $\mathbb{E}[f(Y) \mid X=x]$ is the expectation of $f$ with respect to some probability measure, namely the conditional distribution of $Y$ given $X=x$. Therefore the same restrictions apply as for any expectation: as soon as $E[|f(Y)|\, |X=x]<\infty$ or $f(Y)\geq 0$, we may define $E[f(Y)\mid X=x]$ (by separating the positive and negative parts of $f$ in the first case).

Anyway, in the present situation, $0\leq Z\leq 1$ so that both restrictions are fulfilled!

By the way, when you define $E[Y|X=x]$ directly, you also require something about $Y$, like being integrable, or positive.
Quote:

Looks a bit complicated, but strangely, I understand it better when there are a lot of properties rather than learning intuitive stuff...
Not so strange, I'm just like you; however, I thought you had already covered the "formal properties" part... (Giggle)

Ciao,
Laurent.