Let X1 and X2 be a random sample of size 2 from a distribution with probability density function f(x) = exp(-x) for 0 < x < infinity, and f(x) = 0 elsewhere.
Now let Y1 = X1 + X2
and Y2 = X1/(X1 + X2).
Prove that Y1 and Y2 are independent.
The joint pdf of $\displaystyle X_1$ and $\displaystyle X_2$ is $\displaystyle f(x_1, x_2) = e^{-x_1} e^{-x_2}$ for $\displaystyle 0 < x_1 < \infty$, $\displaystyle 0 < x_2 < \infty$ (and zero elsewhere).
The inverse transformation is $\displaystyle x_1 = y_1 y_2$ and $\displaystyle x_2 = y_1 (1 - y_2)$, so the Jacobian is $\displaystyle J = -y_1$ and therefore $\displaystyle |J| = y_1$.
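For completeness, the Jacobian determinant of the inverse transformation works out as:

```latex
J =
\begin{vmatrix}
\dfrac{\partial x_1}{\partial y_1} & \dfrac{\partial x_1}{\partial y_2} \\[2ex]
\dfrac{\partial x_2}{\partial y_1} & \dfrac{\partial x_2}{\partial y_2}
\end{vmatrix}
=
\begin{vmatrix}
y_2 & y_1 \\
1 - y_2 & -y_1
\end{vmatrix}
= -y_1 y_2 - y_1 (1 - y_2) = -y_1,
\qquad |J| = y_1 .
```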
Therefore the joint pdf of $\displaystyle Y_1$ and $\displaystyle Y_2$ is $\displaystyle g(y_1, y_2) = f(x_1(y_1, y_2), x_2(y_1, y_2)) \, |J| = \, ....$ on the support $\displaystyle 0 < y_1 < \infty$, $\displaystyle 0 < y_2 < 1$.
Now show that $\displaystyle g(y_1, y_2)$ can be written as a product of the form $\displaystyle h_1(y_1) \cdot h_2(y_2)$.
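(Spoiler, in case you get stuck: carrying out the substitution, the exponents combine and the factorization falls out as)

```latex
g(y_1, y_2) = e^{-y_1 y_2} \, e^{-y_1 (1 - y_2)} \, y_1
            = y_1 e^{-y_1},
\qquad 0 < y_1 < \infty,\ 0 < y_2 < 1,
```

so one may take $\displaystyle h_1(y_1) = y_1 e^{-y_1}$ (a Gamma(2, 1) density) and $\displaystyle h_2(y_2) = 1$ (the Uniform(0, 1) density); since the joint pdf factors and the support is a product set, $\displaystyle Y_1$ and $\displaystyle Y_2$ are independent.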
Aside: If $\displaystyle Y_1$ and $\displaystyle Y_2$ are independent then $\displaystyle Cov(Y_1, Y_2) = 0$. The converse is NOT true: for example, if $\displaystyle Y \sim N(0, 1)$ then $\displaystyle Cov(Y, Y^2) = E(Y^3) = 0$, yet $\displaystyle Y$ and $\displaystyle Y^2$ are clearly dependent.
$\displaystyle Cov(Y_1, Y_2) = E(Y_1 Y_2) - E(Y_1) E(Y_2) = E(X_1) - [E(X_1) + E(X_2)] E\left(\frac{X_1}{X_1 + X_2} \right)$ $\displaystyle = 1 - (1 + 1) E\left(\frac{X_1}{X_1 + X_2} \right) = 1 - (2) \left(\frac{1}{2} \right) = 0$, as expected.
Note: writing $\displaystyle \frac{x_1}{x_1 + x_2} = 1 - \frac{x_2}{x_1 + x_2}$ and using the symmetry of the integrand in $\displaystyle x_1$ and $\displaystyle x_2$, $\displaystyle I = \int_0^{+\infty} \int_0^{+\infty} \frac{x_1}{x_1 + x_2} e^{-x_1} e^{-x_2} \, dx_1 \, dx_2 = \int_0^{+\infty} \int_0^{+\infty} e^{-x_1} e^{-x_2} \, dx_1 \, dx_2 - I = 1 - I$.
Therefore $\displaystyle 2I = 1$, that is, $\displaystyle E\left(\frac{X_1}{X_1 + X_2} \right) = I = \frac{1}{2}$.
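As a quick numerical sanity check (a Monte Carlo sketch, not part of the proof; the variable names are mine):

```python
import numpy as np

# Draw a large sample of (X1, X2) pairs from the Exp(1) distribution.
rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.exponential(size=n)
x2 = rng.exponential(size=n)

# Transform to Y1 = X1 + X2 and Y2 = X1 / (X1 + X2).
y1 = x1 + x2
y2 = x1 / y1

# E[X1/(X1+X2)] should be close to 1/2.
mean_y2 = y2.mean()

# Independence implies zero correlation (though the converse fails in general).
corr = np.corrcoef(y1, y2)[0, 1]

# A crude factorization check: P(Y1 < 1, Y2 < 1/2) vs P(Y1 < 1) * P(Y2 < 1/2).
joint = np.mean((y1 < 1.0) & (y2 < 0.5))
product = np.mean(y1 < 1.0) * np.mean(y2 < 0.5)

print(mean_y2, corr, joint, product)
```

The sample mean of Y2 hovers near 1/2, the sample correlation near 0, and the joint probability matches the product of the marginals, all consistent with the factorization above.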