# Math Help - Transforming Random Variables

1. ## Transforming Random Variables

Let X1 and X2 be independent chi-square random variables with r1 and r2 degrees of freedom, respectively. Show that:

a) U = X1/(X2 + X1) has a beta distribution with alpha = r1/2 and beta = r2/2

b) V = X2/(X2 + X1) has a beta distribution with alpha = r2/2 and beta = r1/2

How do you even start this? I need a lot of help with this one.

2. Originally Posted by wolverine21
Let X1 and X2 be independent chi-square random variables with r1 and r2 degrees of freedom, respectively. Show that:

a) U = X1/(X2 + X1) has a beta distribution with alpha = r1/2 and beta = r2/2

b) V = X2/(X2 + X1) has a beta distribution with alpha = r2/2 and beta = r1/2

How do you even start this? I need a lot of help with this one.
I'll start a) (Obviously b) starts in a similar way):

The cdf of U is

$F(u) = \Pr(U \leq u) = \Pr\left( \frac{X_1}{X_2 + X_1} \leq u\right) = \Pr(X_1 \leq uX_1 + uX_2)$

(the inequality remains in the same direction since $X_2 + X_1 \geq 0$)

$= \Pr (X_1 - u X_1 \leq uX_2) = \Pr(uX_2 \geq (1 - u) X_1) = \Pr\left( X_2 \geq \frac{1 - u}{u} \, X_1\right)$.

Case 1: $0 < u < 1$: $F(u) = \int_{x_1 = 0}^{+\infty} \int_{x_2 = \frac{1 - u}{u} \, x_1}^{+\infty} f(x_1) \cdot g(x_2) \, dx_2 \, dx_1$

since the joint pdf of $X_1$ and $X_2$ is $f(x_1) \cdot g(x_2)$, where $f$ and $g$ are the chi-square pdfs with $r_1$ and $r_2$ degrees of freedom (the joint pdf factorises because $X_1$ and $X_2$ are independent).

The rest is left for you to do. Note that the pdf of U is given by $h(u) = \frac{dF}{du}$.
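As a hint for that remaining step (a sketch only, assuming $f$ and $g$ are the chi-square pdfs with $r_1$ and $r_2$ degrees of freedom): differentiate under the integral sign with respect to $u$. Only the lower limit of the inner integral depends on $u$, and $\frac{d}{du}\left(\frac{1-u}{u}\right) = -\frac{1}{u^2}$, so the minus sign from differentiating a lower limit cancels, leaving for $0 < u < 1$:

$h(u) = \frac{dF}{du} = \int_{x_1 = 0}^{+\infty} f(x_1) \, g\!\left(\frac{1-u}{u} \, x_1\right) \frac{x_1}{u^2} \, dx_1.$

Substituting the chi-square pdfs, the exponentials combine into $e^{-x_1/(2u)}$, and recognising the resulting $x_1$-integral as a gamma integral leaves exactly $u^{r_1/2 - 1}(1-u)^{r_2/2 - 1}$ times the right normalising constant, i.e. the $\text{Beta}\!\left(\tfrac{r_1}{2}, \tfrac{r_2}{2}\right)$ pdf.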

Case 2: $u \geq 1$: since $0 \leq U < 1$ always (both $X_1$ and $X_2$ are positive), the event $U \leq u$ is certain, so

$F(u) = \int_{x_1 = 0}^{+\infty} \int_{x_2 = 0}^{+\infty} f(x_1) \cdot g(x_2) \, dx_2 \, dx_1 = \int_{x_1 = 0}^{+\infty} f(x_1) \, dx_1 \, \int_{x_2 = 0}^{+\infty} g(x_2) \, dx_2 = 1$

$\Rightarrow h(u) = \frac{dF}{du} = 0$. Similarly $F(u) = 0$, hence $h(u) = 0$, for $u \leq 0$.
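Not part of the proof, but a quick Monte Carlo sanity check of part a): simulate independent chi-square draws and compare the sample mean and variance of $U = X_1/(X_1 + X_2)$ with those of $\text{Beta}(r_1/2, r_2/2)$. The degrees of freedom and sample size below are arbitrary example values.

```python
import numpy as np

rng = np.random.default_rng(0)
r1, r2 = 4, 6        # degrees of freedom (arbitrary example values)
n = 200_000          # number of simulated draws

# Independent chi-square samples
x1 = rng.chisquare(r1, size=n)
x2 = rng.chisquare(r2, size=n)
u = x1 / (x1 + x2)

# Beta(a, b) has mean a/(a+b) and variance ab/((a+b)^2 (a+b+1));
# here a = r1/2 and b = r2/2.
a, b = r1 / 2, r2 / 2
print(u.mean(), a / (a + b))                          # should be close
print(u.var(), a * b / ((a + b) ** 2 * (a + b + 1)))  # should be close
```

With these values the theoretical mean is $r_1/(r_1 + r_2) = 0.4$, and the simulated mean should land within a few thousandths of it.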