How do I get the covariance of X and |X|, i.e. Cov(X, |X|), when X is continuous or discrete? Thanks!
Thanks for your quick reply. The context is that E(X) is given, but E(|X|) is unknown. When X, as a random variable, can lie in both the positive interval [0, inf) and the negative interval (-inf, 0), what is Cov(X, |X|)? The problem I run into is that X is a random variable, so it can jump between the positive and negative cases; can we simply consider the covariance in the positive case X > 0 and the negative case X < 0 separately? If not, what is the result? Thanks!
Hello,
You can't just treat the cases X ≥ 0 and X < 0 separately; you have to condition on the sign.
Okay, let's set $\displaystyle sgn(X)=-1$ if $\displaystyle X<0$ and $\displaystyle sgn(X)=1$ if $\displaystyle X\geq 0$. In other words $\displaystyle sgn(X)=\bold{1}_{X\geq 0}-\bold{1}_{X<0}$ (it's a random variable following a simple discrete distribution, let's say p is the probability that it equals 1)
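As a quick sketch (my own illustration, not from the thread), the convention above can be written directly as the difference of the two indicators:

```python
def sgn(x):
    # sgn(x) = 1_{x >= 0} - 1_{x < 0}: returns 1 for x >= 0 (including 0), -1 for x < 0
    return (1 if x >= 0 else 0) - (1 if x < 0 else 0)

print(sgn(-3), sgn(0), sgn(2))  # -1 1 1
```

Note that 0 is assigned to the positive branch, matching the definition sgn(X) = 1 if X ≥ 0.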
Then $\displaystyle cov(|X|,X)=E[sgn(X)X^2]-E[sgn(X)X]E[X]$
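This identity can be checked numerically; a minimal sketch in Python, using a hypothetical two-point distribution P(X = -1) = P(X = 2) = 1/2 (my example, not from the thread):

```python
# Check cov(|X|, X) = E[sgn(X) X^2] - E[sgn(X) X] E[X]
# for the two-point distribution P(X = -1) = P(X = 2) = 1/2.
xs = [-1.0, 2.0]
ps = [0.5, 0.5]

def sgn(x):
    # Sign convention from the thread: 1 if x >= 0, else -1.
    return 1.0 if x >= 0 else -1.0

E = lambda f: sum(p * f(x) for x, p in zip(xs, ps))

cov_direct = E(lambda x: abs(x) * x) - E(abs) * E(lambda x: x)
cov_formula = E(lambda x: sgn(x) * x**2) - E(lambda x: sgn(x) * x) * E(lambda x: x)
print(cov_direct, cov_formula)  # 0.75 0.75
```

Both expressions agree, as they must: |X| = sgn(X)·X holds pointwise, so the identity is exact for any distribution.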
Now let's consider a conditional covariance. (Sorry, I love using conditional expectations.)
$\displaystyle cov(|X|,X \mid sgn(X))=E[sgn(X)X^2 \mid sgn(X)] - E[sgn(X)X \mid sgn(X)]E[X \mid sgn(X)]$.
Since sgn(X) is indeed sgn(X)-measurable, we can pull it out of the conditional expectations:
$\displaystyle cov(|X|,X \mid sgn(X))= sgn(X) E[X^2 \mid sgn(X)]-sgn(X) (E[X \mid sgn(X)])^2 = sgn(X) Var(X \mid sgn(X))$
So $\displaystyle cov(|X|,X)=E[sgn(X) Var[X \mid sgn(X)]]$, which is an expectation with respect to sgn(X)'s distribution (meaning that sgn(X) is the sole remaining random variable in the expectation).
So this gives $\displaystyle p\,Var(X \mid X\geq 0)-(1-p)\,Var(X \mid X<0)$, where p is, as said before, $\displaystyle P(sgn(X)=1)=P(X\geq 0)$; if both conditional variances equal $\displaystyle Var(X)$, this simplifies to $\displaystyle (2p-1)Var(X)$.
That's the furthest one can go with the provided information. And it doesn't matter whether X is discrete or continuous.
Yes, it's equal. So we can say that $\displaystyle Var[X \mid sgn(X)]=Var[X]$, which lets us simplify:
$\displaystyle E[sgn(X)\,Var[X \mid sgn(X)]]=E[sgn(X)\,Var[X]]=Var[X]\,E[sgn(X)]$, since Var[X] is a constant!
If you understood my method, I have some doubts about this thread belonging in the pre-university subforum.
Actually, the equation $\displaystyle Var[X \mid sgn(X)]=Var[X]$ does not hold: when I use integration to compute Cov(X, |X|) directly, the two final results are not consistent. Please check it again to avoid future mistakes...
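The reported inconsistency can indeed be reproduced numerically; a minimal sketch, again using a hypothetical two-point distribution P(X = -1) = P(X = 2) = 1/2 (my example, not from the thread):

```python
# Compare the exact cov(X, |X|) with the proposed formula (2p - 1) Var(X)
# for the two-point distribution P(X = -1) = P(X = 2) = 1/2.
xs = [-1.0, 2.0]
ps = [0.5, 0.5]
E = lambda f: sum(p * f(x) for x, p in zip(xs, ps))

cov_exact = E(lambda x: x * abs(x)) - E(lambda x: x) * E(abs)  # = 0.75
p = sum(pr for x, pr in zip(xs, ps) if x >= 0)                 # P(X >= 0) = 0.5
var = E(lambda x: x**2) - E(lambda x: x)**2                    # Var(X) = 2.25
cov_claimed = (2 * p - 1) * var                                # = 0.0
print(cov_exact, cov_claimed)  # 0.75 0.0 -- the formula fails here
```

In this example Var(X | X ≥ 0) = Var(X | X < 0) = 0 while Var(X) = 2.25, so Var[X | sgn(X)] = Var[X] fails, and with it the simplification above.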