# Random Variables

• Oct 2nd 2009, 01:21 PM
azdang
Random Variables
Given $\displaystyle (\Omega, \mathcal{A}, P)$, suppose $\displaystyle X$ is a random variable with $\displaystyle X\geq0$ and $\displaystyle E[X]=1$. Define $\displaystyle Q:\mathcal{A}\to\mathbb{R}$ by $\displaystyle Q(A)=E[X 1_A]$. Show that if $\displaystyle P(A)=0$, then $\displaystyle Q(A)=0$. Give an example showing that $\displaystyle Q(A)=0$ does not in general imply $\displaystyle P(A)=0$.

I'm pretty sure that $\displaystyle E[1_A]=P(A)$, so I was thinking that maybe we could break $\displaystyle Q(A)$ up into $\displaystyle E[X]E[1_A]$. However, that would assume that $\displaystyle X$ and $\displaystyle 1_A$ are independent random variables. Would this be true? If so, then it's very obvious why $\displaystyle P(A)=0$ implies $\displaystyle Q(A)=0$. Anyone have any hints? Thanks!
• Oct 4th 2009, 02:20 AM
Laurent
Quote:

Originally Posted by azdang
Given $\displaystyle (\Omega, \mathcal{A}, P)$, suppose $\displaystyle X$ is a random variable with $\displaystyle X\geq0$ and $\displaystyle E[X]=1$. Define $\displaystyle Q:\mathcal{A}\to\mathbb{R}$ by $\displaystyle Q(A)=E[X 1_A]$. Show that if $\displaystyle P(A)=0$, then $\displaystyle Q(A)=0$. Give an example showing that $\displaystyle Q(A)=0$ does not in general imply $\displaystyle P(A)=0$.

I'm pretty sure that $\displaystyle E[1_A]=P(A)$, so I was thinking that maybe we could break $\displaystyle Q(A)$ up into $\displaystyle E[X]E[1_A]$. However, that would assume that $\displaystyle X$ and $\displaystyle 1_A$ are independent random variables. Would this be true? If so, then it's very obvious why $\displaystyle P(A)=0$ implies $\displaystyle Q(A)=0$. Anyone have any hints? Thanks!

It is false (without further assumptions) that $\displaystyle Q(A)=E[X]E[1_A]$. However, if $\displaystyle P(A)=0$, then $\displaystyle X 1_A=0$ almost everywhere (it is 0 outside $\displaystyle A$, and $\displaystyle A$ is negligible), hence $\displaystyle Q(A)=0$.
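In symbols, this reasoning condenses to a one-line chain (a sketch restating the argument above, using only $\displaystyle X\geq 0$ and the fact that an almost-surely-zero variable has expectation zero):

```latex
% The set where X 1_A is nonzero is contained in A:
\{\omega : X(\omega)\,1_A(\omega) \neq 0\} \subseteq A,
% so P(A) = 0 gives X 1_A = 0 almost surely, and therefore
Q(A) = E[X 1_A] = E[0] = 0.
```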

As for the example, you must choose $\displaystyle X$ in such a way that $\displaystyle X 1_A$ is zero almost surely, i.e. $\displaystyle X$ is 0 on $\displaystyle A$, and it is chosen outside $\displaystyle A$ in such a way that $\displaystyle E[X]\,(=E[X 1_{A^c}])=1$.
• Oct 4th 2009, 05:31 PM
azdang
Ooh, the first part seems so obvious. Thank you, Laurent. :)

As for the second part, I'm still a little lost. My first question is: Is it correct that $\displaystyle X1_A$ = X if x is in A and 0 if x is not in A?
• Oct 7th 2009, 12:19 PM
Laurent
Quote:

Originally Posted by azdang
Ooh, the first part seems so obvious. Thank you, Laurent. :)

As for the second part, I'm still a little lost. My first question is: Is it correct that $\displaystyle X1_A$ = X if x is in A and 0 if x is not in A?

No. What is true is that $\displaystyle X(\omega) 1_A(\omega)=X(\omega)$ if $\displaystyle \omega\in A$ and $\displaystyle X(\omega) 1_A(\omega)=0$ else. Remember $\displaystyle A$ is an event, not a subset of $\displaystyle \mathbb{R}$.

For instance (you should write this more formally), suppose we toss a fair coin, and define the event $\displaystyle A=\{\text{the outcome is heads}\}$ and $\displaystyle X$ is defined to be 0 in case heads shows up, and to be 2 if tails shows up. Then what are $\displaystyle P(A)$, $\displaystyle E[X]$ and $\displaystyle E[X 1_A]$?
• Oct 13th 2009, 06:35 AM
azdang
Well, $\displaystyle P(A)$ should be 0.5, correct? So wouldn't $\displaystyle E[X 1_A]$ be 0 in either case (whether we toss heads or tails)? So in this case, $\displaystyle Q(A)=0$, but $\displaystyle P(A)\neq 0$.
• Oct 13th 2009, 06:58 AM
Laurent
Quote:

Originally Posted by azdang
Well, $\displaystyle P(A)$ should be 0.5, correct? So wouldn't $\displaystyle E[X 1_A]$ be 0 in either case (whether we toss heads or tails)? So in this case, $\displaystyle Q(A)=0$, but $\displaystyle P(A)\neq 0$.

Yes. (More precisely, $\displaystyle X1_A$ is 0 in both cases, hence $\displaystyle E[X 1_A]=0$.)
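The coin example can also be checked numerically. Below is a minimal sketch over a two-point sample space; the variable names (`omega`, `prob`, `expectation`, etc.) are hypothetical, but the probabilities and values of $\displaystyle X$ come from the fair-coin setup above:

```python
# Finite sample space for the fair-coin counterexample:
# Omega = {heads, tails}, P(heads) = P(tails) = 0.5,
# A = {heads}, X(heads) = 0, X(tails) = 2 (so X >= 0).

omega = ["heads", "tails"]
prob = {"heads": 0.5, "tails": 0.5}
X = {"heads": 0.0, "tails": 2.0}
A = {"heads"}

def expectation(f):
    """E[f] over the finite sample space: sum of f(w) * P({w})."""
    return sum(f(w) * prob[w] for w in omega)

P_A = expectation(lambda w: 1.0 if w in A else 0.0)            # P(A) = E[1_A]
E_X = expectation(lambda w: X[w])                              # E[X]
Q_A = expectation(lambda w: X[w] * (1.0 if w in A else 0.0))   # Q(A) = E[X 1_A]

print(P_A, E_X, Q_A)  # 0.5 1.0 0.0
```

The output confirms the point of the example: $\displaystyle Q(A)=0$ even though $\displaystyle P(A)=0.5\neq 0$, while $\displaystyle E[X]=1$ as required.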