Thread: Prove that Y1 and Y2 are independent

1. Prove that Y1 and Y2 are independent

Let X1 and X2 be a random sample of size 2 from a distribution with probability density function f(x) = exp(-x) for 0 < x < infinity, and f(x) = 0 elsewhere.

Now, let Y1= X1+ X2.
Y2 = X1/(X1+X2).

Prove that Y1 and Y2 are independent.

2. Originally Posted by cryptic26
Let X1 and X2 be a random sample of size 2 from a distribution with probability density function f(x) = exp(-x) for 0 < x < infinity, and f(x) = 0 elsewhere.

Now, let Y1= X1+ X2.
Y2 = X1/(X1+X2).

Prove that Y1 and Y2 are independent.
The joint pdf of $X_1$ and $X_2$ is $f(x_1, x_2) = e^{-x_1} e^{-x_2}$.

Inverting the transformation gives $X_1 = Y_1 Y_2$ and $X_2 = Y_1 (1 - Y_2)$. The Jacobian of this inverse transformation has absolute value $|J| = y_1$.

Therefore the joint pdf of $Y_1$ and $Y_2$ is $g(y_1, y_2) = f(x_1(y_1, y_2), x_2(y_1, y_2)) \, |J| = \, ....$

Now show that $g(y_1, y_2)$ can be written as a product of the form $h_1(y_1) \cdot h_2(y_2)$.
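Not part of the proof, but a quick Monte Carlo sanity check of the claim (a sketch; the sample size and seed are arbitrary choices): simulate the transformation and confirm that the sample covariance of $Y_1$ and $Y_2$ is near zero, with $E(Y_1) = E(X_1) + E(X_2) = 2$ and $E(Y_2) = \frac{1}{2}$.

```python
import random

random.seed(0)
n = 200_000
y1, y2 = [], []
for _ in range(n):
    # Exp(1) draws, matching f(x) = exp(-x) on (0, infinity)
    x1 = random.expovariate(1.0)
    x2 = random.expovariate(1.0)
    s = x1 + x2
    y1.append(s)        # Y1 = X1 + X2
    y2.append(x1 / s)   # Y2 = X1 / (X1 + X2)

mean_y1 = sum(y1) / n   # should be close to 2
mean_y2 = sum(y2) / n   # should be close to 1/2

# Sample covariance: close to 0 if Y1 and Y2 are uncorrelated
cov = sum(a * b for a, b in zip(y1, y2)) / n - mean_y1 * mean_y2
print(mean_y1, mean_y2, cov)
```

A covariance near zero is of course consistent with, but weaker than, the independence you get from the factorisation of $g$.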

Aside: If $Y_1$ and $Y_2$ are independent then $Cov(Y_1, Y_2) = 0$. The converse is NOT true.

$Cov(Y_1, Y_2) = E(Y_1 Y_2) - E(Y_1) E(Y_2) = E(X_1) - [E(X_1) + E(X_2)] E\left(\frac{X_1}{X_1 + X_2} \right)$ $= 1 - (1 + 1) E\left(\frac{X_1}{X_1 + X_2} \right) = 1 - (2) \left(\frac{1}{2} \right) = 0$, as expected.

Note: $I = \int_0^{+\infty} \int_0^{+\infty} \frac{x_1}{x_1 + x_2} e^{-x_1} e^{-x_2} \, dx_1 \, dx_2 = \int_0^{+\infty} \int_0^{+\infty} e^{-x_1} e^{-x_2} \, dx_1 \, dx_2 - I$, since swapping the labels $x_1 \leftrightarrow x_2$ shows that $\int_0^{+\infty} \int_0^{+\infty} \frac{x_2}{x_1 + x_2} e^{-x_1} e^{-x_2} \, dx_1 \, dx_2 = I$.

Therefore $2I = 1$, and so $E\left(\frac{X_1}{X_1 + X_2} \right) = I = \frac{1}{2}$.
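The symmetry trick can also be seen numerically (a sketch, not part of the argument; sample size and seed are arbitrary): for every draw, $\frac{X_1}{X_1 + X_2} + \frac{X_2}{X_1 + X_2} = 1$, and the two ratios have the same distribution, so each must average $\frac{1}{2}$.

```python
import random

random.seed(1)
n = 100_000
s1 = s2 = 0.0
for _ in range(n):
    x1 = random.expovariate(1.0)  # Exp(1) draw, matching f(x) = exp(-x)
    x2 = random.expovariate(1.0)
    s1 += x1 / (x1 + x2)
    s2 += x2 / (x1 + x2)

est1, est2 = s1 / n, s2 / n
# The two ratios sum to 1 for every draw, so est1 + est2 == 1 up to float error;
# by symmetry est1 and est2 estimate the same quantity I, forcing I = 1/2.
print(est1, est2)
```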

3. Thanks!

4. Originally Posted by cryptic26
Let X1 and X2 be a random sample of size 2 from a distribution with probability density function f(x) = exp(-x) for 0 < x < infinity, and f(x) = 0 elsewhere.

Now, let Y1= X1+ X2.
Y2 = X1/(X1+X2).

Prove that Y1 and Y2 are independent.
Anyone wishing to contribute to this thread can pm me. In light of another thread being completely vandalised by edit-deletes, I'm closing this thread.