# How to get the covariance of X and |X|, i.e. cov(X,|X|)?

• May 31st 2011, 02:07 PM
lyb66
How to get the covariance of X and |X|, i.e. cov(X,|X|)?
How do you get the covariance of X and |X|, i.e. cov(X,|X|), when X is continuous or discrete? Thanks!
• May 31st 2011, 03:14 PM
Moo
Hello,

\$\displaystyle cov(X,|X|)=E[X|X|]-E[X]E[|X|]\$...

But then it can be simplified or not, depending on what your exercise really is. Please state the context...
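As a quick numerical illustration of this formula (a minimal Monte Carlo sketch, not from the thread; the standard normal is just an assumed example, for which symmetry makes the true covariance 0):

```python
import random
import statistics

# Monte Carlo estimate of cov(X, |X|) = E[X|X|] - E[X]E[|X|]
# for X ~ Normal(0, 1); by symmetry the true value is 0.
random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(200_000)]

e_x = statistics.fmean(xs)
e_absx = statistics.fmean(abs(x) for x in xs)
e_x_absx = statistics.fmean(x * abs(x) for x in xs)

cov = e_x_absx - e_x * e_absx
print(cov)  # close to 0, within Monte Carlo error
```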
• May 31st 2011, 04:50 PM
lyb66
Thanks for your quick reply. The context is that E(X) is given, but E(|X|) is unknown. Since X, as a random variable, can take values in both the positive interval [0, inf) and the negative interval (-inf, 0), what is Cov(X,|X|)? My problem is that X is a random variable, so it can jump between the positive and negative cases. Can we simply consider the covariance in the positive case x>0 and the negative case x<0 separately? If not, what is the result? Thanks!
• Jun 1st 2011, 02:28 AM
Moo
Hello,

You can't just split into the cases X>0 and X<0; you have to condition on the sign.

Okay, let's set \$\displaystyle sgn(X)=-1\$ if \$\displaystyle X<0\$ and \$\displaystyle sgn(X)=1\$ if \$\displaystyle X\geq 0\$. In other words, \$\displaystyle sgn(X)=\bold{1}_{X\geq 0}-\bold{1}_{X<0}\$ (it's a random variable with a simple discrete distribution; say p is the probability that it equals 1).

Then \$\displaystyle cov(|X|,X)=E[sgn(X)X^2]-E[sgn(X)X]E[X]\$

Now let's consider a conditional covariance. (sorry, I love using conditional expectations :D)

\$\displaystyle cov(|X|,X \mid sgn(X))=E[sgn(X)X^2 \mid sgn(X)] - E[sgn(X)X \mid sgn(X)]E[X \mid sgn(X)]\$.
Since sgn(X) is indeed sgn(X)-measurable, we can pull it out of the expectations:
\$\displaystyle cov(|X|,X \mid sgn(X))= sgn(X) E[X^2 \mid sgn(X)]-sgn(X) (E[X \mid sgn(X)])^2 = sgn(X) Var(X \mid sgn(X))\$
So \$\displaystyle cov(|X|,X)=E[sgn(X) Var[X \mid sgn(X)]]\$, which is an expectation with respect to sgn(X)'s distribution (meaning that sgn(X) is the sole remaining random variable in the expectation).
So this gives \$\displaystyle pVar(X)-(1-p)Var(X)=(2p-1)Var(X)\$, where p is, as said before, \$\displaystyle P(sgn(X)=1)=P(X\geq 0)\$

That's the furthest one can go with the provided information. And it doesn't matter whether X is discrete or continuous.
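The first identity above, \$\displaystyle cov(|X|,X)=E[sgn(X)X^2]-E[sgn(X)X]E[X]\$, is exact (since sgn(X)X=|X| and sgn(X)X²=X|X|) and can be sanity-checked by simulation; this minimal sketch assumes X ~ Normal(1, 2) purely as an example:

```python
import math
import random
import statistics

# Sanity check of cov(|X|, X) = E[sgn(X) X^2] - E[sgn(X) X] E[X],
# using X ~ Normal(1, 2) as an assumed example distribution.
def sgn(x: float) -> float:
    return 1.0 if x >= 0 else -1.0

random.seed(2)
xs = [random.gauss(1.0, 2.0) for _ in range(100_000)]

e_x = statistics.fmean(xs)
rhs = (statistics.fmean(sgn(x) * x * x for x in xs)
       - statistics.fmean(sgn(x) * x for x in xs) * e_x)

# Direct sample covariance of the pairs (|x|, x).
e_absx = statistics.fmean(abs(x) for x in xs)
lhs = statistics.fmean((abs(x) - e_absx) * (x - e_x) for x in xs)

assert math.isclose(lhs, rhs, abs_tol=1e-8)  # identical up to rounding
print(lhs, rhs)
```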
• Jun 5th 2011, 08:53 AM
lyb66
Thank you so much for your very smart and creative answer, but I still have one more question: is Var[X|sgn(X)=-1]=Var(X), and likewise Var[X|sgn(X)=1]=Var(X)?
• Jun 12th 2011, 12:36 AM
Moo
Quote:

Originally Posted by lyb66
Thank you so much for your very smart and creative answer, but I still have one more question: is Var[X|sgn(X)=-1]=Var(X), and likewise Var[X|sgn(X)=1]=Var(X)?

Yes, it's equal. So we can say that \$\displaystyle Var[X|sgn(X)]=Var[X]\$, which lets us simplify:
\$\displaystyle E[sgn(X) Var[X|sgn(X)]]=E[sgn(X) Var[X]]=Var[X] E[sgn(X)]\$, since Var[X] is a constant!

If you understand my method, I have some doubts about your thread belonging to the pre-university subforum :D
• Jun 13th 2011, 09:40 AM
lyb66
Actually, the equation \$\displaystyle Var[X|sgn(X)]=Var[X]\$ does not hold: when I use direct integration to compute cov(X,|X|), the two final results are not consistent... Please check it again to avoid future mistakes...
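This objection can be checked numerically. A minimal Monte Carlo sketch (the shifted exponential X = Y - 0.5 with Y ~ Exp(1) is just an assumed example) shows both that Var(X | sgn(X)) differs from Var(X) and that cov(X,|X|) differs from (2p-1)Var(X):

```python
import random
import statistics

# Counterexample sketch (assumed example): X = Y - 0.5 with Y ~ Exponential(1).
# Checks whether Var(X | sgn(X)) = Var(X) and whether
# cov(X, |X|) = (2p - 1) * Var(X), where p = P(X >= 0).
random.seed(1)
xs = [random.expovariate(1.0) - 0.5 for _ in range(400_000)]

var_x = statistics.pvariance(xs)                          # close to 1
var_neg = statistics.pvariance([x for x in xs if x < 0])  # Var(X | X < 0)

p = sum(x >= 0 for x in xs) / len(xs)                     # estimate of P(X >= 0)

e_x = statistics.fmean(xs)
e_absx = statistics.fmean(abs(x) for x in xs)
cov = statistics.fmean(x * abs(x) for x in xs) - e_x * e_absx

print(var_x, var_neg)            # Var(X | X < 0) is far below Var(X)
print(cov, (2 * p - 1) * var_x)  # the two quantities clearly disagree
```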