1. Independence

First I would like to clear up some confusion I have about independence. In my textbook I have:
If $\displaystyle F(x,y)=F_X(x)F_Y(y)$, then X and Y are indep.
However, in my notes from class I have:
If $\displaystyle f(x,y)=f_X(x)f_Y(y)$, then X and Y are indep.
Which is correct?

Is this the only way to show independence? Does it have anything to do with expectation values? I had a question in which I had:
$\displaystyle X=Z, Y=Z^2; Z \sim N(\mu,\sigma^2)$
I was first asked to find the following: $\displaystyle <X>,<Y>,<XY>$ and here is what I did:
$\displaystyle <X>=<Z>=\mu$
$\displaystyle <Y>=<Z^2>=Var(Z)+<Z>^2=\sigma^2+\mu^2$
$\displaystyle <XY>=<Z^3>$; for this I took the third derivative of the moment generating function and got this expression: $\displaystyle 3\sigma^2\mu+\mu^3$

I was then asked whether X and Y are independent. Naturally I tried to use what I had just found: $\displaystyle <XY>\neq<X><Y>$, therefore they are dependent. Is this correct?
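As a numerical sanity check on those three moments (a sketch, not part of the original question; the parameters $\displaystyle \mu=1$, $\displaystyle \sigma=2$ are arbitrary, and the standard normal fact $\displaystyle <Z^3>=\mu^3+3\mu\sigma^2$ is used):

```python
import numpy as np

# Hypothetical parameters, chosen only for this check.
mu, sigma = 1.0, 2.0

rng = np.random.default_rng(0)
z = rng.normal(mu, sigma, size=1_000_000)
x, y = z, z**2  # X = Z, Y = Z^2

print(np.mean(x))      # ~ mu = 1.0
print(np.mean(y))      # ~ sigma^2 + mu^2 = 5.0
print(np.mean(x * y))  # ~ mu^3 + 3*mu*sigma^2 = 13.0

# E[XY] is far from E[X]*E[Y] (~5.0), consistent with dependence.
print(np.mean(x) * np.mean(y))
```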

2. $\displaystyle Y=X^2$ so they are clearly dependent.
It certainly doesn't matter if you have normality or not.
If you differentiate $\displaystyle F_X(x)F_Y(y)=F(x,y)$ with respect to $x$ and $y$, you get $\displaystyle f_X(x)f_Y(y)=f(x,y)$.
Likewise if $\displaystyle f_X(x)f_Y(y)=f(x,y)$, then
$\displaystyle F(x,y)=\int_{-\infty}^x\int_{-\infty}^yf(s,t)dtds =\int_{-\infty}^xf_X(s)ds\int_{-\infty}^yf_Y(t)dt=F_X(x)F_Y(y)$
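So for continuous random variables the two factorizations imply each other, and either one characterizes independence. As an illustration (an assumed toy example, not from the thread), here is a numerical check that the joint CDF of two genuinely independent variables factors into the product of the marginal CDFs:

```python
import numpy as np

# Toy example: X and Y drawn independently, each Exponential(1).
rng = np.random.default_rng(1)
x = rng.exponential(1.0, size=500_000)
y = rng.exponential(1.0, size=500_000)  # independent of x

x0, y0 = 1.0, 2.0
joint = np.mean((x <= x0) & (y <= y0))           # empirical F(x0, y0)
product = np.mean(x <= x0) * np.mean(y <= y0)    # empirical F_X(x0) F_Y(y0)

print(joint, product)  # the two estimates agree closely
```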

3. Okay, I can see why they are dependent, but how can I show it? I can write:
$\displaystyle f_X(x)=\frac{1}{\sigma\sqrt{2\pi}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}$

But what is $\displaystyle f_Y(y)$?
Is it just $\displaystyle [f_X(x)]^2$, so $\displaystyle f_Y(y)=\frac{1}{\sigma^22\pi}e^{-2\frac{(y-\mu)^2}{2\sigma^2}}$ ?
Or since $\displaystyle Y=X^2$, is it just $\displaystyle f_Y(y)=\frac{1}{\sigma\sqrt{2\pi}}e^{-\frac{(\sqrt{y}-\mu)^2}{2\sigma^2}}$ ?

And worse, how can I combine them and get $\displaystyle f(x,y)$?
I'm quite lost.

You can't square the density of $\displaystyle X$ to get the density of $\displaystyle X^2$.
Try showing that the covariance is not zero.
Show that $\displaystyle E(XY)-E(X)E(Y)=E(X^3)-E(X)E(X^2)$ is not zero.
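Carrying that computation through with the standard normal moments $\displaystyle E(X^2)=\mu^2+\sigma^2$ and $\displaystyle E(X^3)=\mu^3+3\mu\sigma^2$:

$\displaystyle E(XY)-E(X)E(Y)=(\mu^3+3\mu\sigma^2)-\mu(\mu^2+\sigma^2)=2\mu\sigma^2,$

which is nonzero whenever $\displaystyle \mu\neq0$. (When $\displaystyle \mu=0$ the covariance vanishes even though $\displaystyle Y=X^2$ is still dependent on $\displaystyle X$, so in that case one must fall back on the factorization criterion instead.)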

5. I know that if $\displaystyle X$ and $\displaystyle Y$ are independent, then $\displaystyle Cov(X,Y)=0$. But I thought it doesn't work the other way around; namely, even if $\displaystyle Cov(X,Y)=0$, it doesn't necessarily mean that they are independent. So even if I show that $\displaystyle Cov(X,Y)\neq0$, how can I say that they are indeed dependent?

6. Originally Posted by synclastica_86
I know that if $\displaystyle X$ and $\displaystyle Y$ are independent, then $\displaystyle Cov(X,Y)=0$. But I thought it doesn't work the other way around; namely, even if $\displaystyle Cov(X,Y)=0$, it doesn't necessarily mean that they are independent. So even if I show that $\displaystyle Cov(X,Y)\neq0$, how can I say that they are indeed dependent?

Yes, that's called the contrapositive:
"A implies B" is logically equivalent to "not B implies not A". Here A is "X and Y are independent" and B is "$\displaystyle Cov(X,Y)=0$", so $\displaystyle Cov(X,Y)\neq0$ implies that X and Y are dependent.

7. Originally Posted by matheagle
You can't square the density of X to get the density of $\displaystyle X^2$
But in principle, with the information given, can I find $\displaystyle f_Y(y)$ and $\displaystyle f(x,y)$ , and therefore, their respective distribution functions?

8. Originally Posted by synclastica_86
But in principle, with the information given, can I find $\displaystyle f_Y(y)$ and $\displaystyle f(x,y)$ , and therefore, their respective distribution functions?

The formula for obtaining the density of $\displaystyle Y=X^2$ is right under "Derivation of the pdf for one degree of freedom" on Chi-square distribution - Wikipedia, the free encyclopedia.
But X and Y do not have a joint density on $\displaystyle R^2$: their joint distribution is concentrated on the parabola $\displaystyle y=x^2$.
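Following that derivation, the density of $\displaystyle Y=X^2$ picks up contributions from both square roots and a Jacobian factor: $\displaystyle f_Y(y)=\frac{f_X(\sqrt{y})+f_X(-\sqrt{y})}{2\sqrt{y}}$ for $y>0$. A numerical sketch checking this formula against simulation (the parameters $\displaystyle \mu=1$, $\displaystyle \sigma=2$ are arbitrary assumptions):

```python
import numpy as np

# Assumed parameters, chosen only for illustration.
mu, sigma = 1.0, 2.0

def f_X(x):
    # N(mu, sigma^2) density
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def f_Y(y):
    # Change-of-variables density of Y = X^2, valid for y > 0:
    # both roots +/- sqrt(y) contribute, each with Jacobian 1/(2*sqrt(y)).
    r = np.sqrt(y)
    return (f_X(r) + f_X(-r)) / (2 * r)

rng = np.random.default_rng(2)
samples = rng.normal(mu, sigma, size=1_000_000)**2

# Compare the formula to an empirical density estimate on a few bins.
edges = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
for lo, hi in zip(edges[:-1], edges[1:]):
    empirical = np.mean((samples >= lo) & (samples < hi)) / (hi - lo)
    print(lo, hi, empirical, f_Y((lo + hi) / 2))
```

The bin-averaged empirical frequencies should track the formula's values at the bin midpoints.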