1. ## Bivariate random variable

First the question:

Let $\displaystyle X$ and $\displaystyle Y$ be two independent random variables. If $\displaystyle X$ and $\displaystyle Y$ are both standard normal, then what is the distribution of $\displaystyle \frac{1}{2} (X^2-Y^2)$?

So I've tried to use the fact that X and Y are standard normal to tell me that $\displaystyle \mu_x=\mu_y=0$ and $\displaystyle {\sigma}_{x}=\sigma_y=1$.
So,
$\displaystyle {\sigma}^2=E[X^2]-E[X]^2$
$\displaystyle 1=E[X^2]-0^2 \implies E[X^2]=1$

I found that $\displaystyle E\big[\tfrac{1}{2} (X^2-Y^2)\big]= \frac{1}{2} (E[X^2] -E[Y^2]) = \frac{1}{2}(1-1) = 0$

I'm not really sure what to do with this.
Maybe I am wrongly assuming something.
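A quick simulation (my own sanity check, not from the book) agrees with the two expectations computed above: $E[X^2]=1$ and $E\big[\tfrac12(X^2-Y^2)\big]=0$. The sample size and seed are arbitrary choices.

```python
import random

# Sanity check: with X, Y i.i.d. standard normal,
# E[X^2] should be 1 and E[(X^2 - Y^2)/2] should be 0.
random.seed(0)
N = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
ys = [random.gauss(0.0, 1.0) for _ in range(N)]

mean_x2 = sum(x * x for x in xs) / N                         # close to 1
mean_z = sum((x * x - y * y) / 2 for x, y in zip(xs, ys)) / N  # close to 0
```

Of course, as the replies below make clear, matching a mean and variance is not enough to pin down the distribution itself.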

2. Hello,
Originally Posted by NoFace
First the question:

Let $\displaystyle X$ and $\displaystyle Y$ be two independent random variables. If $\displaystyle X$ and $\displaystyle Y$ are both standard normal, then what is the distribution of $\displaystyle \frac{1}{2} (X^2-Y^2)$?

So I've tried to use the fact that X and Y are standard normal to tell me that $\displaystyle \mu_x=\mu_y=0$ and $\displaystyle {\sigma}_{x}=\sigma_y=1$.
So,
$\displaystyle {\sigma}^2=E[X^2]-E[X]^2$
$\displaystyle 1=E[X^2]-0^2 \implies E[X^2]=1$

I found that $\displaystyle E\big[\tfrac{1}{2} (X^2-Y^2)\big]= \frac{1}{2} (E[X^2] -E[Y^2]) = \frac{1}{2}(1-1) = 0$

I'm not really sure what to do with this.
Maybe I am wrongly assuming something.

Unfortunately, the expectation and variance do not characterize a distribution!

Have you ever dealt with problems that required finding the pdf of a function of X and Y (e.g. $\displaystyle \tfrac 12(X^2-Y^2)$), given the pdfs of X and Y?

3. Originally Posted by Moo
Have you ever dealt with problems that required finding the pdf of a function of X and Y (e.g. $\displaystyle \tfrac 12(X^2-Y^2)$), given the pdfs of X and Y?
Unfortunately, no.

I began using the fact that $\displaystyle M_{aX+bY}(t)=M_X(at)M_Y(bt)$ for independent $\displaystyle X$ and $\displaystyle Y$ to find the moment generating function and then match it to a known distribution, but the fact that I am working with $\displaystyle X^2$ and $\displaystyle Y^2$ is tripping me up.
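The identity I'm quoting can at least be checked numerically. For standard normal $X$, $M_X(s)=e^{s^2/2}$, so $M_{aX+bY}(t)$ should equal $e^{(at)^2/2}e^{(bt)^2/2}$. A rough sketch, with $a$, $b$, $t$ and the sample size chosen arbitrarily:

```python
import math
import random

# Check M_{aX+bY}(t) = M_X(at) * M_Y(bt) for independent X, Y ~ N(0,1),
# whose MGF is M(s) = exp(s^2 / 2). a, b, t are arbitrary.
random.seed(1)
a, b, t = 2.0, 3.0, 0.1
N = 200_000

emp = sum(math.exp(t * (a * random.gauss(0, 1) + b * random.gauss(0, 1)))
          for _ in range(N)) / N
theory = math.exp((a * t) ** 2 / 2) * math.exp((b * t) ** 2 / 2)
# emp and theory should agree to within Monte Carlo noise
```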

4. Both $\displaystyle X^2$ and $\displaystyle Y^2$ are $\displaystyle \chi^2$ random variables (one degree of freedom each). Their difference has support on all of $\displaystyle \mathbb{R}$, so it's not a gamma or anything obvious; hence you need to play with the densities. Now, the sum of $\displaystyle X^2$ and $\displaystyle Y^2$ is a different matter.
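The contrast is easy to see in simulation: $X^2+Y^2$ is $\chi^2$ with 2 degrees of freedom (nonnegative, mean 2), while $X^2-Y^2$ takes both signs, so it can't be gamma. A small sketch (seed and sample size arbitrary):

```python
import random

random.seed(2)
N = 100_000
sums, diffs = [], []
for _ in range(N):
    x2 = random.gauss(0, 1) ** 2
    y2 = random.gauss(0, 1) ** 2
    sums.append(x2 + y2)
    diffs.append(x2 - y2)

# X^2 + Y^2 is chi-square with 2 df: nonnegative, mean 2.
mean_sum = sum(sums) / N
# X^2 - Y^2 is negative about half the time by symmetry,
# so it cannot be gamma or chi-square.
frac_neg = sum(d < 0 for d in diffs) / N
```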

5. Figured it out. After consulting with the professor, we established that the answer in the back of the book is wrong. Way wrong.

Thanks.

6. ## Solution if anyone is interested

Here is the solution:
$\displaystyle X^2$ and $\displaystyle Y^2$ are $\displaystyle \chi^2$ random variables with one degree of freedom each, so we already know the moment generating functions: $\displaystyle M_{X^2}(t) = M_{Y^2}(t) = (1-2t)^{-\frac{1}{2}}$.

$\displaystyle M_{\frac{1}{2}(X^2-Y^2)}(t) = E\big[e^{\frac{t}{2}(X^2-Y^2)}\big]$

$\displaystyle = E\big[e^{\frac{t}{2}X^2}\big]E\big[e^{-\frac{t}{2}Y^2}\big]$ (by independence)

$\displaystyle = M_{X^2}\bigg(\frac{t}{2}\bigg)M_{Y^2}\bigg(-\frac{t}{2}\bigg)$

$\displaystyle = (1-t)^{-\frac{1}{2}}(1+t)^{-\frac{1}{2}}$

$\displaystyle = (1-t^2)^{-\frac{1}{2}}$

This is the MGF of a distribution with no name that I could find, and it is certainly not what my book says.

7. You can derive the density, but as I said before, the MGF technique won't help you recognize the probability distribution.