# Math Help - Probability Density

1. ## Probability Density

If the probability density of X is given by
$f(x) = \begin{cases} 1+x & \text{for } -1 < x \le 0 \\ 1-x & \text{for } 0 < x < 1 \\ 0 & \text{elsewhere} \end{cases}$
and U = X and V = X^2, show that (a) cov(U,V) = 0. (b) U and V are dependent.

2. Originally Posted by TheHolly
If the probability density of X is given by
$f(x) = \begin{cases} 1+x & \text{for } -1 < x \le 0 \\ 1-x & \text{for } 0 < x < 1 \\ 0 & \text{elsewhere} \end{cases}$
and U = X and V = X^2, show that (a) cov(U,V) = 0. (b) U and V are dependent.

If no-one replies in the meantime, I might have time later tonight (my time) to have a closer look.

3. I would think that you would use the transformation theorem to get the pdf of $V$. Note that $v = x^2$ is not monotonic on $(-1, 1)$, so you need the two-branch version of the theorem, summing the contributions of $x = \sqrt{v}$ and $x = -\sqrt{v}$.

Then show that $p_{U,V}(u,v) \neq p_{U}(u) \cdot p_{V}(v)$ for dependence. And use the definition of covariance: $\text{Cov}(U,V) = E[UV] - \mu_{U} \mu_{V}$.
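For post 3's route, here is a small numerical sanity check (my Python sketch, not part of the thread), assuming the two-branch form of the transformation theorem for $v = x^2$: summing the contributions of $x = \pm\sqrt{v}$ should give a density $f_V$ that integrates to 1.

```python
import math

# The density from the question (triangular on (-1, 1)).
def f_X(x):
    if -1 < x <= 0:
        return 1 + x
    if 0 < x < 1:
        return 1 - x
    return 0.0

# Two-branch transformation for V = X^2: both x = sqrt(v) and
# x = -sqrt(v) map to the same v, so their contributions are summed.
def f_V(v):
    if 0 < v < 1:
        r = math.sqrt(v)
        return (f_X(r) + f_X(-r)) / (2 * r)
    return 0.0

# Sanity check: f_V should integrate to 1 over (0, 1); the midpoint
# rule copes with the integrable singularity at v = 0 well enough.
n = 200_000
total = sum(f_V((k + 0.5) / n) for k in range(n)) / n
print(round(total, 2))  # ≈ 1.0
```

For this $f$, the formula reduces to $f_V(v) = (1-\sqrt{v})/\sqrt{v}$ on $(0,1)$, whose integral is $\left[2\sqrt{v} - v\right]_0^1 = 1$.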

4. cov(U,V)=E((U-u)(V-v)) where u, v are E(U), E(V) respectively.

u=E(U)=E(X)=0, since f is symmetric about 0.

So we have:

cov(U,V)=E(U(V-v))
=E(UV-Uv)
=E(UV)-E(Uv)
=E(X^3)-vE(U)
=E(X^3)-vu
=E(X^3)-v·0
=E(X^3)
=0

This last step holds because x^3 is an odd function and f(x) is an even function, so their product x^3·f(x) is odd. Hence the integral of x^3·f(x) over (-1, 1) (i.e. the expected value of X^3) is 0.

Dependence follows...
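Oli's derivation can also be checked numerically (a Python sketch of mine, not from the thread): approximate the moments of X with a midpoint rule and confirm that $E(X^3)$ and the covariance both vanish.

```python
# Numerical check of the derivation above: with U = X and V = X^2,
# cov(U, V) = E(X^3) - E(X) E(X^2), and both odd moments vanish.
def f(x):
    if -1 < x <= 0:
        return 1 + x
    if 0 < x < 1:
        return 1 - x
    return 0.0

def moment(p, n=100_000):
    # Midpoint rule for E(X^p) = integral of x^p f(x) over (-1, 1).
    h = 2.0 / n
    return sum(((-1 + (k + 0.5) * h) ** p) * f(-1 + (k + 0.5) * h) * h
               for k in range(n))

EX, EX2, EX3 = moment(1), moment(2), moment(3)
cov = EX3 - EX * EX2
print(abs(EX3) < 1e-9, abs(cov) < 1e-9)  # True True
```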

5. Originally Posted by TheHolly
If the probability density of X is given by
$f(x) = \begin{cases} 1+x & \text{for } -1 < x \le 0 \\ 1-x & \text{for } 0 < x < 1 \\ 0 & \text{elsewhere} \end{cases}$
and U = X and V = X^2, show that (a) cov(U,V) = 0. (b) U and V are dependent.

(b) Clearly U and V are dependent, by definition, since one variable is determined by the other.

(a) You need to find $\text{cov} (X, X^2) = E(X \cdot X^2) - E(X) \cdot E(X^2) = E(X^3) - E(X) \cdot E(X^2)$.
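The moments needed here are easy to compute exactly; the following Python sketch (mine, not from the thread) integrates each polynomial piece of $x^p f(x)$ with rational arithmetic, using $\int x^k \, dx = x^{k+1}/(k+1)$.

```python
from fractions import Fraction

# Exact moments of X for the density in the question:
# E(X^p) = int_{-1}^{0} x^p (1+x) dx + int_{0}^{1} x^p (1-x) dx.
def E(p):
    left = Fraction((-1) ** p, p + 1) + Fraction((-1) ** (p + 1), p + 2)
    right = Fraction(1, p + 1) - Fraction(1, p + 2)
    return left + right

cov = E(3) - E(1) * E(2)
print(E(1), E(2), E(3), cov)  # 0 1/6 0 0
```

So $E(X) = E(X^3) = 0$, $E(X^2) = 1/6$, and the covariance is exactly 0.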

6. Originally Posted by Oli
cov(U,V) = E((U-u)(V-v)) = ... = E(X^3) = 0

Dependence follows...
NB: Independence implies covariance = 0 but covariance = 0 does NOT imply independence.

7. Let A be the event that U is in (0, 0.5).
Let B be the event that V is in (0, 0.25).

$P(A)P(B) \neq P(A \cap B)$
(just do the sums and this is clear: you don't have to do all the sums, just enough to show this is true...)

This implies dependence.
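Here is one way to "do the sums" numerically (my sketch, assuming the events as stated above). Since $V = X^2$, B is the event $X \in (-0.5, 0) \cup (0, 0.5)$, and A implies B.

```python
# Post 7's events: A = {U in (0, 0.5)}, B = {V in (0, 0.25)}.
# V = X^2, so B means X in (-0.5, 0.5) (the point X = 0 has probability 0).
def f(x):
    if -1 < x <= 0:
        return 1 + x
    if 0 < x < 1:
        return 1 - x
    return 0.0

def prob(a, b, n=100_000):
    # P(a < X < b) by the midpoint rule.
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * h for k in range(n))

P_A = prob(0, 0.5)       # = 3/8
P_B = prob(-0.5, 0.5)    # = 3/4
P_AB = prob(0, 0.5)      # A implies B, so P(A and B) = P(A)
print(round(P_A * P_B, 3), round(P_AB, 3))  # ≈ 0.281 vs 0.375
```

Since $P(A)P(B) = 9/32 \neq 3/8 = P(A \cap B)$, U and V are dependent.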

8. Sorry, when I said dependence follows, I meant I would be adding a proof of dependence afterwards... done it now.

This is in fact a counterexample: the point of the question is to exhibit dependent variables with covariance 0.

9. Some helpful hints with this stuff:

Expected values are one of your most useful tools with probability theory:
Know stuff like:
An expected value is always a constant, so a random variable can never equal an expected value unless that random variable is constant. So if you end up proving something like:
E(X)=Y, where X and Y are random variables and Y is not constant, you are probably wrong.

Expected value satisfies:
E(aX+bY)=aE(X)+bE(Y)

It DOES NOT SATISFY E(XY)=E(X)E(Y) in general; independence guarantees equality (though, as this thread's exercise shows, equality can also hold without independence).

If A is an event, and 1{A} is the indicator function of the event (i.e. 1{A} = 1 if the event happens, 0 otherwise), then:
E(1{A})=P(A)
This is slightly more advanced, but can often be a useful trick.
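As a small illustration of the indicator trick (my sketch, not from the thread): a standard fact is that $U_1 - U_2$, for $U_1, U_2$ independent Uniform(0,1), has density $1 - |x|$ on $(-1, 1)$, i.e. exactly the $f$ in this question. So a Monte Carlo estimate of $E(1\{A\})$ should land near $P(A)$.

```python
import random

random.seed(42)
# Indicator trick: E(1{A}) = P(A), estimated by simulation for
# A = {0 < X < 0.5}, where X has the triangular density from this
# thread. Sampling assumption: X = U1 - U2 with U1, U2 independent
# Uniform(0, 1) has density 1 - |x| on (-1, 1).
n = 1_000_000
hits = sum(1 for _ in range(n) if 0 < random.random() - random.random() < 0.5)
estimate = hits / n
print(estimate)  # close to P(A) = 3/8 = 0.375
```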