See here: http://www.well.ox.ac.uk/~valdar/reference/jacobian.pdf
I have a problem and I am at a loss:
Suppose that Z is a standard normal random variable and Y1 and Y2 are χ2-distributed random variables with ν1 and ν2 degrees of freedom, respectively. Further, assume that Z, Y1, and Y2 are independent.
a) Define W = Z/SQRT(Y1). Find E(W) and V(W). What assumptions do you need about the value of ν1?
b) Define U = Y1/Y2. Find E(U) and V(U). What assumptions do you need about the values of ν1 and ν2?
I know the moment generating functions of Z, Y1 & Y2, but I don't know how I'm supposed to find the distribution of W, since we're dividing one random variable by another... Any help would be vastly appreciated!
df=degrees of freedom
I figured you were given this problem because you're studying the F and T distributions.
There are three ways to attack this.
1. Derive the densities, which is messy and unnecessary given that you're only asked to obtain the moments.
2. Find the moments via the Y's, which isn't that hard.
3. Use the known t and F moments and obtain the moments of these two in a minute.
Using the last one: U = Y1/Y2 = (ν1/ν2)·[(Y1/ν1)/(Y2/ν2)], and the bracketed ratio is exactly an F random variable with ν1 and ν2 df. Likewise W = Z/SQRT(Y1) = (1/SQRT(ν1))·Z/SQRT(Y1/ν1), and Z/SQRT(Y1/ν1) is exactly a t random variable with ν1 df.
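If you want to convince yourself of those two identities numerically, here's a quick simulation sketch. The df values 7 and 9 and the use of numpy/scipy are just my own illustrative choices, not part of the problem:

```python
# Simulate Z, Y1, Y2 and check that sqrt(v1)*W behaves like a t(v1) variate
# and (v2/v1)*U like an F(v1, v2) variate.  Illustrative df values only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
v1, v2, n = 7, 9, 200_000

Z  = rng.standard_normal(n)
Y1 = rng.chisquare(v1, n)
Y2 = rng.chisquare(v2, n)

W = Z / np.sqrt(Y1)   # W = Z / SQRT(Y1)
U = Y1 / Y2           # U = Y1 / Y2

# sqrt(v1)*W = Z / sqrt(Y1/v1)      ~ t with v1 df
print(stats.kstest(np.sqrt(v1) * W, stats.t(v1).cdf))
# (v2/v1)*U = (Y1/v1) / (Y2/v2)     ~ F with (v1, v2) df
print(stats.kstest((v2 / v1) * U, stats.f(v1, v2).cdf))
```

Large p-values from the two KS tests are what you should see, since the identities are exact, not approximate.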
Now look up the mean and variance of an F, which aren't that hard to derive yourself (that's my second suggested technique), and rescale them: E(U) = (ν1/ν2)·E(F) and V(U) = (ν1/ν2)²·V(F). Do the same with the t moments for W, dividing by SQRT(ν1) and by ν1.
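For what it's worth, here's where that route leads, with a Monte Carlo sanity check tacked on. The closed forms below come from plugging the standard t and F moments into the rescaling above (check the algebra against your own derivation); the df values and numpy usage are again just illustrative:

```python
# Closed-form moments of W and U obtained by rescaling the standard t and F
# moments, compared against a simulation.  Requires v1 > 2 for V(W) and
# v2 > 4 for V(U); E(W) needs v1 > 1 and E(U) needs v2 > 2.
import numpy as np

rng = np.random.default_rng(1)
v1, v2, n = 7, 9, 2_000_000

W = rng.standard_normal(n) / np.sqrt(rng.chisquare(v1, n))
U = rng.chisquare(v1, n) / rng.chisquare(v2, n)

# E(W) = 0,               V(W) = (1/v1) * Var(t_v1)   = 1/(v1 - 2)
# E(U) = (v1/v2) * E(F)   = v1/(v2 - 2)
# V(U) = (v1/v2)**2 * V(F) = 2*v1*(v1 + v2 - 2) / ((v2 - 2)**2 * (v2 - 4))
EW, VW = 0.0, 1 / (v1 - 2)
EU = v1 / (v2 - 2)
VU = 2 * v1 * (v1 + v2 - 2) / ((v2 - 2) ** 2 * (v2 - 4))

print("E(W):", W.mean(), "vs", EW)
print("V(W):", W.var(),  "vs", VW)
print("E(U):", U.mean(), "vs", EU)
print("V(U):", U.var(),  "vs", VU)
```

The existence conditions (ν1 > 1 and ν1 > 2 for W; ν2 > 2 and ν2 > 4 for U) are exactly where the "what assumptions do you need" parts of (a) and (b) come from.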
If you'd rather derive the F (or U) moments directly, the pieces you need are E(1/Y2) and E(1/Y2^2), which aren't that hard to obtain either.
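A sketch of that last claim, assuming the usual gamma-integral result E(Y^(-k)) = Γ(ν/2 - k)/(2^k·Γ(ν/2)) for a χ2(ν) variable with ν > 2k (worth verifying yourself): it gives E(1/Y2) = 1/(ν2 - 2) and E(1/Y2^2) = 1/((ν2 - 2)(ν2 - 4)). The snippet below just checks this numerically at an arbitrary ν = 9:

```python
# Inverse moments of a chi-square: simulation vs the gamma formula vs the
# simplified closed forms 1/(v-2) and 1/((v-2)*(v-4)).  Needs v > 2k.
import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(2)
v, n = 9, 2_000_000
Y = rng.chisquare(v, n)

for k, closed in [(1, 1 / (v - 2)), (2, 1 / ((v - 2) * (v - 4)))]:
    via_gamma = gamma(v / 2 - k) / (2 ** k * gamma(v / 2))
    print(f"E(1/Y^{k}):  simulation {np.mean(Y ** -k):.5f}   "
          f"gamma formula {via_gamma:.5f}   closed form {closed:.5f}")
```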