# Thread: Joint probability distribution of functions of random variables

1. ## Joint probability distribution of functions of random variables

Suppose X and Y are independent gamma random variables with parameters $(\alpha,\lambda)$ and $(\beta,\lambda)$ respectively. I want to compute the joint density of $U=X+Y$ and $V=\frac{X}{X+Y}$ without using the Jacobian transformation.

The hint provided is to differentiate the following equation with respect to $u$ and $v$:

$P(U\leq u, V\leq v)=\iint_{(x,y):\,x+y\leq u,\ \frac{x}{x+y}\leq v} f_{X,Y}(x,y)\,dx\,dy$

Now, how do I differentiate this equation with respect to $u$ and $v$?

2. ## Re: Joint probability distribution of functions of random variables

Originally Posted by Vinod
Suppose X and Y are independent gamma random variables with parameters $(\alpha,\lambda)$ and $(\beta,\lambda)$ respectively. I want to compute the joint density of $U=X+Y$ and $V=\frac{X}{X+Y}$ without using the Jacobian transformation.

The hint provided is to differentiate the following equation with respect to $u$ and $v$:

$P(U\leq u, V\leq v)=\iint_{(x,y):\,x+y\leq u,\ \frac{x}{x+y}\leq v} f_{X,Y}(x,y)\,dx\,dy$

Now, how do I differentiate this equation with respect to $u$ and $v$?
The first step is to explicitly define the limits of integration in terms of $u$ and $v$.

3. ## Re: Joint probability distribution of functions of random variables

Originally Posted by romsek
The first step is to explicitly define the limits of integration in terms of $u$ and $v$.
Hello,
The joint PDF of $X$ and $Y$ with parameters $(\alpha,\lambda)$ and $(\beta,\lambda)$ respectively is $\frac{\lambda^{\alpha+\beta} e^{-\lambda (x+y)} x^{\alpha-1}y^{\beta-1}}{\Gamma(\alpha)\Gamma(\beta)}$. We can express $x=uv$ and $y=u(1-v)$. Now what should the limits of integration be?

4. ## Re: Joint probability distribution of functions of random variables

Originally Posted by Vinod
Hello,
The joint PDF of $X$ and $Y$ with parameters $(\alpha,\lambda)$ and $(\beta,\lambda)$ respectively is $\frac{\lambda^{\alpha+\beta} e^{-\lambda (x+y)} x^{\alpha-1}y^{\beta-1}}{\Gamma(\alpha)\Gamma(\beta)}$. We can express $x=uv$ and $y=u(1-v)$. Now what should the limits of integration be?
Now, $u=x+y \rightarrow du=dx+dy$ and $v=\frac{x}{x+y} \rightarrow dv=\frac{y}{(x+y)^2}dx-\frac{x}{(x+y)^2}dy$.

So $f_{U,V}(u,v)\,du\,dv=\frac{\lambda^{\alpha+\beta} e^{-\lambda(x+y)}x^{\alpha-1} y^{\beta-1}}{\Gamma(\alpha) \Gamma(\beta)}(dx+dy)\bigg(\frac{y}{(x+y)^2}dx -\frac{x}{(x+y)^2}dy\bigg)$

So, finally, we get $\frac{\lambda^{\alpha+\beta} e^{-\lambda (x+y)} x^{\alpha-1} y^{\beta-1}(y-x)}{\Gamma(\alpha) \Gamma(\beta) (x+y)^2}dxdy$

Now, how do I simplify this in terms of $u$ and $v$?
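Not part of the thread's derivation, but one way to see the simplification: under the substitution $x=uv$, $y=u(1-v)$ from post #3, the Jacobian determinant $\det\frac{\partial(x,y)}{\partial(u,v)}$ equals $-u$, so $dx\,dy = u\,du\,dv$. A finite-difference sketch in Python (all names and sample values here are illustrative) checks that determinant numerically:

```python
# Sketch: verify det d(x,y)/d(u,v) = -u for x = u*v, y = u*(1 - v).
def xy(u, v):
    return u * v, u * (1.0 - v)

def jacobian_det(u, v, h=1e-6):
    # central finite differences for the 2x2 Jacobian of (x, y) w.r.t. (u, v)
    x_u = (xy(u + h, v)[0] - xy(u - h, v)[0]) / (2 * h)
    y_u = (xy(u + h, v)[1] - xy(u - h, v)[1]) / (2 * h)
    x_v = (xy(u, v + h)[0] - xy(u, v - h)[0]) / (2 * h)
    y_v = (xy(u, v + h)[1] - xy(u, v - h)[1]) / (2 * h)
    return x_u * y_v - x_v * y_u

u0, v0 = 2.5, 0.3
det = jacobian_det(u0, v0)
assert abs(det - (-u0)) < 1e-4   # det = -u, so |det| = u and dx dy = u du dv
```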

5. ## Re: Joint probability distribution of functions of random variables

Originally Posted by romsek
the first step is to explicitly define the limits of integration in terms of $u$ and $v$
You can work out the algebra, but what I'm seeing is that

$F_{UV}(u,v) = \displaystyle \int_0^{uv} \int_0^{u(1-v)}~f_{XY}(x,y)~dy~dx$

you want the PDF,

$f_{UV}(u,v) = \dfrac{\partial^2}{\partial u\partial v}~F_{UV}(u,v)$

So you have to differentiate through the integral.

I confess I've never done partials through a double integral before. With a single variable

$\dfrac{d}{du} \displaystyle \int_{a(u)}^{b(u)}~f(x)~dx = \dfrac{db}{du} f(b(u))-\dfrac{da}{du}f(a(u))$

but I'm unsure how to extend this to partials of a double integral. There is something called Leibniz's rule that apparently applies here.

I'm sure one of the math gurus here can explain how this is done.
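A numeric sketch of the single-variable rule quoted above, with illustrative choices $f(x)=e^{-x}$, $a(u)=u^2$, $b(u)=2u$ (none of these come from the thread): the finite-difference derivative of the integral should agree with $b'(u)f(b(u))-a'(u)f(a(u))$.

```python
import math

# Illustrative check of d/du of an integral with u-dependent limits,
# using f(x) = exp(-x), a(u) = u**2, b(u) = 2*u.
def f(x):
    return math.exp(-x)

def F(u, n=2000):
    # trapezoidal approximation of the integral of f from a(u) to b(u)
    a, b = u * u, 2 * u
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

u0, h = 0.4, 1e-5
numeric = (F(u0 + h) - F(u0 - h)) / (2 * h)      # finite-difference d/du
leibniz = 2 * f(2 * u0) - 2 * u0 * f(u0 * u0)    # b'(u) f(b) - a'(u) f(a)
assert abs(numeric - leibniz) < 1e-4
```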

6. ## Re: Joint probability distribution of functions of random variables

Originally Posted by romsek
you can work out the algebra but what I'm seeing is that

$F_{UV}(u,v) = \displaystyle \int_0^{uv} \int_0^{u(1-v)}~f_{XY}(x,y)~dy~dx$

you want the PDF,

$f_{UV}(u,v) = \dfrac{\partial^2}{\partial u\partial v}~F_{UV}(u,v)$

So you have to differentiate through the integral.

I confess I've never done partials through a double integral before. With a single variable

$\dfrac{d}{du} \displaystyle \int_{a(u)}^{b(u)}~f(x)~dx = \dfrac{db}{du} f(b(u))-\dfrac{da}{du}f(a(u))$

but I'm unsure how to extend this to partials of a double integral. There is something called Leibniz's Rule that is apparently applied.

I'm sure one of the math gurus here can explain how this is done.
Mathematica gives this mess:

$\Large \begin{cases} \frac{\left(\frac{1}{\lambda }\right)^{-\alpha -\beta } \lambda ^{-\alpha -\beta -1} e^{-\frac{u (v+1)}{\lambda }} \left(u v^2 e^{u/\lambda } \left(\frac{u v}{\lambda }\right)^{\alpha } \Gamma \left(\beta ,\frac{u-u v}{\lambda }\right)-u v^2 e^{\frac{2 u v}{\lambda }} \left(\frac{u-u v}{\lambda }\right)^{\beta } \Gamma \left(\alpha ,\frac{u v}{\lambda }\right)+\lambda \left(-e^{\frac{u v}{\lambda }}\right) \left(\frac{u v}{\lambda }\right)^{\alpha } \left(\frac{u-u v}{\lambda }\right)^{\beta }+2 \lambda v e^{\frac{u v}{\lambda }} \left(\frac{u v}{\lambda }\right)^{\alpha } \left(\frac{u-u v}{\lambda }\right)^{\beta }-(v-1) \Gamma (\beta ) e^{u/\lambda } (u v-\alpha \lambda ) \left(\frac{u v}{\lambda }\right)^{\alpha }+v \Gamma (\alpha ) e^{\frac{2 u v}{\lambda }} \left(\frac{u-u v}{\lambda }\right)^{\beta } (\beta \lambda +u (v-1))-u v e^{u/\lambda } \left(\frac{u v}{\lambda }\right)^{\alpha } \Gamma \left(\beta ,\frac{u-u v}{\lambda }\right)+\alpha \lambda e^{u/\lambda } \left(\frac{u v}{\lambda }\right)^{\alpha } \Gamma \left(\beta ,\frac{u-u v}{\lambda }\right)-\alpha \lambda v e^{u/\lambda } \left(\frac{u v}{\lambda }\right)^{\alpha } \Gamma \left(\beta ,\frac{u-u v}{\lambda }\right)+u v e^{\frac{2 u v}{\lambda }} \left(\frac{u-u v}{\lambda }\right)^{\beta } \Gamma \left(\alpha ,\frac{u v}{\lambda }\right)-\beta \lambda v e^{\frac{2 u v}{\lambda }} \left(\frac{u-u v}{\lambda }\right)^{\beta } \Gamma \left(\alpha ,\frac{u v}{\lambda }\right)\right)}{u (v-1) v \Gamma (\alpha ) \Gamma (\beta )} & v<1 \\ 0 & \text{otherwise} \end{cases}$

7. ## Re: Joint probability distribution of functions of random variables

Hello,
In post #4, I gave a wrong answer in terms of $x$ and $y$. In this post I give the correct final answer for the joint density of $U$ and $V$ in terms of $u$ and $v$.

$f_{U,V}(u,v) =\frac{\lambda e^{-\lambda u}(\lambda u)^{\alpha+\beta-1}}{\Gamma(\alpha+\beta)}\cdot \frac{v^{\alpha-1}(1-v)^{\beta-1}\,\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}$

Hence, U and V are independent, with U having a gamma distribution with parameters $(\alpha +\beta,\lambda)$ and V having probability density function

$f_{V}(v)=\frac{\Gamma(\alpha+\beta)\, v^{\alpha-1}(1-v)^{\beta-1}}{\Gamma(\alpha)\Gamma(\beta)}, \; \; \; 0 < v < 1$. This is called the beta density with parameters $(\alpha, \beta)$.

This is the same answer we would get using the Jacobian transformation.
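As a Monte Carlo sanity check on this conclusion (a sketch, with illustrative parameter values; note that Python's `random.gammavariate` takes shape and scale, so the scale is $1/\lambda$): the sample mean of $U$ should be near $(\alpha+\beta)/\lambda$, the sample mean of $V$ near $\alpha/(\alpha+\beta)$, and the sample covariance of $U$ and $V$ near zero, consistent with independence.

```python
import random

# Simulate X ~ Gamma(alpha, lambda) and Y ~ Gamma(beta, lambda) independently,
# then form U = X + Y and V = X / (X + Y).
random.seed(1)
alpha, beta, lam = 2.0, 3.0, 1.5
n = 200_000

us, vs = [], []
for _ in range(n):
    x = random.gammavariate(alpha, 1.0 / lam)   # scale = 1 / rate
    y = random.gammavariate(beta, 1.0 / lam)
    us.append(x + y)
    vs.append(x / (x + y))

mean_u = sum(us) / n          # expect (alpha + beta) / lam for Gamma(alpha+beta, lam)
mean_v = sum(vs) / n          # expect alpha / (alpha + beta) for Beta(alpha, beta)
cov = sum((u - mean_u) * (v - mean_v) for u, v in zip(us, vs)) / n

assert abs(mean_u - (alpha + beta) / lam) < 0.05
assert abs(mean_v - alpha / (alpha + beta)) < 0.01
assert abs(cov) < 0.01        # consistent with U and V being independent
```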

8. ## Re: Joint probability distribution of functions of random variables

Originally Posted by Vinod
Now, $u=x+y \rightarrow du=dx+dy$ and $v=\frac{x}{x+y} \rightarrow dv=\frac{y}{(x+y)^2}dx-\frac{x}{(x+y)^2}dy$.

So $f_{U,V}(u,v)\,du\,dv=\frac{\lambda^{\alpha+\beta} e^{-\lambda(x+y)}x^{\alpha-1} y^{\beta-1}}{\Gamma(\alpha) \Gamma(\beta)}(dx+dy)\bigg(\frac{y}{(x+y)^2}dx -\frac{x}{(x+y)^2}dy\bigg)$

So, finally, we get $\frac{\lambda^{\alpha+\beta} e^{-\lambda (x+y)} x^{\alpha-1} y^{\beta-1}(y-x)}{\Gamma(\alpha) \Gamma(\beta) (x+y)^2}dxdy$

Now, how do I simplify this in terms of $u$ and $v$?
Hello,

Please read: finally, we get $\frac{\lambda^{\alpha +\beta} e^{-\lambda(x+y)} x^{\alpha-1} y^{\beta-1}(x+y)}{\Gamma(\alpha)\Gamma(\beta) (x+y)^2}\,dxdy$, i.e. the factor $(y-x)$ should be $(x+y)$.

The final correct answer in terms of $u$ and $v$ is given in post #7 of this thread.
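A pointwise numeric sketch tying the pieces together (parameter and grid values below are illustrative): after substituting $x=uv$, $y=u(1-v)$, the quantity $u\,f_{X,Y}(uv,\,u(1-v))$ should equal the product of the $\text{Gamma}(\alpha+\beta,\lambda)$ density in $u$ and the $\text{Beta}(\alpha,\beta)$ density in $v$ stated in post #7.

```python
import math

# Joint gamma density of (X, Y) from post #3.
def f_xy(x, y, alpha, beta, lam):
    return (lam ** (alpha + beta) * math.exp(-lam * (x + y))
            * x ** (alpha - 1) * y ** (beta - 1)
            / (math.gamma(alpha) * math.gamma(beta)))

# Gamma(shape, lam) density as written in post #7.
def gamma_pdf(u, shape, lam):
    return lam * math.exp(-lam * u) * (lam * u) ** (shape - 1) / math.gamma(shape)

# Beta(alpha, beta) density as written in post #7.
def beta_pdf(v, alpha, beta):
    return (math.gamma(alpha + beta) / (math.gamma(alpha) * math.gamma(beta))
            * v ** (alpha - 1) * (1 - v) ** (beta - 1))

alpha, beta, lam = 2.0, 3.5, 1.25
for u in (0.5, 1.0, 4.0):
    for v in (0.1, 0.5, 0.9):
        lhs = u * f_xy(u * v, u * (1 - v), alpha, beta, lam)
        rhs = gamma_pdf(u, alpha + beta, lam) * beta_pdf(v, alpha, beta)
        assert abs(lhs - rhs) < 1e-10
```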