# Thread: How to get the covariance of X and |X|, i.e. cov(X, |X|)?

1. ## How to get the covariance of X and |X|, i.e. cov(X, |X|)?

How do we get the covariance of X and |X|, i.e. cov(X, |X|), when X is continuous or discrete? Thanks!

2. Hello,

$cov(X,|X|)=E[X|X|]-E[X]E[|X|]$...

But then it can be simplified or not, depending on what your exercise really is. Please state the context...
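
A quick numerical illustration of that identity (a Python sketch added for reference; the distribution below is just a made-up example):

```python
# Hypothetical discrete distribution: maps value v to P(X = v)
dist = {-2.0: 0.2, -1.0: 0.3, 1.0: 0.3, 3.0: 0.2}

e_x     = sum(p * v for v, p in dist.items())           # E[X]
e_absx  = sum(p * abs(v) for v, p in dist.items())      # E[|X|]
e_xabsx = sum(p * v * abs(v) for v, p in dist.items())  # E[X|X|]

cov_x_absx = e_xabsx - e_x * e_absx
print(cov_x_absx)  # ≈ 0.68 for this particular distribution
```

The same three expectations (sums in the discrete case, integrals in the continuous case) are all the definition needs.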

3. Thanks for your quick reply. The context is that E(X) is given, but E(|X|) is unknown. When X, as a random variable, can fall both in the positive interval [0, inf) and in the negative interval (-inf, 0), what is Cov(X, |X|)? My problem is that since X is a random variable, it can jump between the positive and negative cases, so can we simply consider the covariance in the two cases? If not, what is the result? Thanks!

4. Thanks for your quick reply. The context is that E(X) has been given, but E(|X|) is unknown. When X, as a random variable, can fall both in the positive interval [0, inf) and in the negative interval (-inf, 0), what is Cov(X, |X|)? My problem is that since X is a random variable, it can jump between the positive and negative cases, so can we simply consider the covariance separately in the positive case x > 0 and the negative case x < 0? If not, what is the result? Thanks!

5. Hello,

You can't just split into the cases X ≥ 0 and X < 0 and treat them separately. You have to condition on the sign.

Okay, let's set $sgn(X)=-1$ if $X<0$ and $sgn(X)=1$ if $X\geq 0$. In other words, $sgn(X)=\mathbf{1}_{X\geq 0}-\mathbf{1}_{X<0}$ (it's a random variable with a simple discrete distribution; say $p$ is the probability that it equals 1).

Then, since $|X|=sgn(X)X$, we get $cov(|X|,X)=E[sgn(X)X^2]-E[sgn(X)X]E[X]$.

Now let's consider a conditional covariance (sorry, I love using conditional expectations).

$cov(|X|,X \mid sgn(X))=E[sgn(X)X^2 \mid sgn(X)] - E[sgn(X)X \mid sgn(X)]E[X \mid sgn(X)]$.
Since sgn(X) is, of course, sgn(X)-measurable, we can pull it out of the expectations:
$cov(|X|,X \mid sgn(X))= sgn(X) E[X^2 \mid sgn(X)]-sgn(X) (E[X \mid sgn(X)])^2 = sgn(X) Var(X \mid sgn(X))$
So $cov(|X|,X)=E[sgn(X) Var[X \mid sgn(X)]]$, which is an expectation with respect to sgn(X)'s distribution (meaning that sgn(X) is the sole remaining random variable in the expectation).
So this gives $pVar(X)-(1-p)Var(X)=(2p-1)Var(X)$, where p is, as said before, $P(sgn(X)=1)=P(X\geq 0)$

That's the furthest one can go with the provided information. And it doesn't matter whether X is discrete or continuous.
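
As a Monte Carlo sanity check of the $(2p-1)Var(X)$ expression (a Python sketch I'm adding; the Gaussian choice and its parameters are just an example): for a normal X the two quantities do agree, which is consistent with Stein's lemma $E[g(X)(X-\mu)]=\sigma^2 E[g'(X)]$ applied to $g(x)=|x|$. As the later posts show, though, the agreement is not general.

```python
import random
import statistics

random.seed(0)

# Monte Carlo check for X ~ Normal(mu, sigma^2); parameters are arbitrary
mu, sigma, n = 1.0, 1.0, 200_000
xs = [random.gauss(mu, sigma) for _ in range(n)]

mean_x    = statistics.fmean(xs)
mean_absx = statistics.fmean(abs(x) for x in xs)
cov_hat   = statistics.fmean(x * abs(x) for x in xs) - mean_x * mean_absx

p_hat   = sum(x >= 0 for x in xs) / n                      # estimate of P(X >= 0)
var_hat = statistics.fmean((x - mean_x) ** 2 for x in xs)  # estimate of Var(X)
formula = (2 * p_hat - 1) * var_hat                        # (2p - 1) Var(X)

print(cov_hat, formula)  # the two estimates are close for a Gaussian X
```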

6. Thank you so much for your very smart and creative answer, but I still have one more question: is Var[X | sgn(X) = -1] = Var(X), and likewise Var[X | sgn(X) = 1] = Var(X)?

7. Originally Posted by lyb66
Thank you so much for your very smart and creative answer, but I still have one more question: is Var[X | sgn(X) = -1] = Var(X), and likewise Var[X | sgn(X) = 1] = Var(X)?
Yes, it's equal. So we can say that $Var[X|sgn(X)]=Var[X]$, which lets us simplify:
$E[sgn(X) Var[X|sgn(X)]]=E[sgn(X) Var[X]]=Var[X]E[sgn(X)]$, since Var[X] is a constant!

If you can follow my method, I have some doubts about this thread belonging in the pre-university subforum

8. Actually, the equation $Var[X|sgn(X)]=Var[X]$ does not hold: when I compute cov(X,|X|) by direct integration, the two final results are not consistent... please check it again to avoid future mistakes...
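
The objection in post #8 checks out numerically. A minimal counterexample (my Python sketch, using a hypothetical two-point distribution), together with the between-group term from the law of total covariance that the derivation above drops:

```python
# Hypothetical counterexample: X = -1 or X = 2, each with probability 1/2
dist = {-1.0: 0.5, 2.0: 0.5}

e_x     = sum(p * v for v, p in dist.items())           # E[X]    = 0.5
e_absx  = sum(p * abs(v) for v, p in dist.items())      # E[|X|]  = 1.5
e_xabsx = sum(p * v * abs(v) for v, p in dist.items())  # E[X|X|] = 1.5
cov = e_xabsx - e_x * e_absx                            # true covariance: 0.75

# The thread's formula: p = P(X >= 0) = 0.5 and Var(X) = 2.25,
# so (2p - 1) Var(X) = 0, which disagrees with the true value 0.75.
var_x   = sum(p * (v - e_x) ** 2 for v, p in dist.items())
p_pos   = sum(p for v, p in dist.items() if v >= 0)
claimed = (2 * p_pos - 1) * var_x                       # 0.0

# Given sgn(X), X is constant here, so Var(X | sgn(X)) = 0 != Var(X).
# The law of total covariance supplies the dropped between-group term:
#   Cov(X,|X|) = E[sgn(X) Var(X|sgn(X))] + p(1-p)(m_plus^2 - m_minus^2),
# with m_plus = E[X | X >= 0] and m_minus = E[X | X < 0].
m_plus, m_minus = 2.0, -1.0
between = p_pos * (1 - p_pos) * (m_plus ** 2 - m_minus ** 2)  # 0.75

print(cov, claimed, between)  # 0.75 0.0 0.75
```

Here the first term of the decomposition is zero and the between-group term alone equals the true covariance, so the formula $(2p-1)Var(X)$ cannot hold in general.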