
Thread: Joint probability distribution of functions of random variables

  1. #1
    Senior Member Vinod's Avatar
    Joined
    Sep 2011
    From
    I live here
    Posts
    381
    Thanks
    7

    Joint probability distribution of functions of random variables

    X and Y are independent gamma random variables with parameters $(\alpha,\lambda)$ and $(\beta,\lambda)$ respectively. I want to compute the joint density of $U=X+Y$ and $V=\frac{X}{X+Y}$ without using the Jacobian transformation.

    The hint provided is to differentiate the following equation with respect to $u$ and $v$:

    $P(U\leq u, V\leq v)=\iint_{\{(x,y):\, x+y\leq u,\ \frac{x}{x+y}\leq v\}} f_{X,Y}(x,y)\,dx\,dy$

    Now, how do I differentiate this equation with respect to $u$ and $v$?
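    As a quick sanity check on this CDF, one can estimate the left-hand side by simulation. A minimal sketch in plain Python (my own example, not from the hint), using the special case $\alpha=\beta=1$, $\lambda=1$: there X and Y are standard exponentials, and it is a standard fact that V is then uniform on $(0,1)$ and independent of U, so $P(U\leq u, V\leq v)=(1-e^{-u}(1+u))\,v$.

```python
import math
import random

random.seed(1)

# Special case alpha = beta = 1, lam = 1: X, Y are standard exponentials.
# Then U = X + Y is Gamma(2, 1) and V = X/(X+Y) is Uniform(0, 1),
# independent of U, so P(U <= u, V <= v) = (1 - e^{-u}(1 + u)) * v.
u, v, n = 2.0, 0.5, 200_000

hits = 0
for _ in range(n):
    x = random.expovariate(1.0)
    y = random.expovariate(1.0)
    if x + y <= u and x / (x + y) <= v:
        hits += 1

estimate = hits / n
exact = (1 - math.exp(-u) * (1 + u)) * v
print(estimate, exact)   # both ≈ 0.297
```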

  2. #2
    MHF Contributor
    Joined
    Nov 2013
    From
    California
    Posts
    6,419
    Thanks
    2788

    Re: Joint probability distribution of functions of random variables

    Quote Originally Posted by Vinod View Post
    X and Y are independent gamma random variables with parameters $(\alpha,\lambda)$ and $(\beta,\lambda)$ respectively. I want to compute the joint density of $U=X+Y$ and $V=\frac{X}{X+Y}$ without using the Jacobian transformation.

    The hint provided is to differentiate the following equation with respect to $u$ and $v$:

    $P(U\leq u, V\leq v)=\iint_{\{(x,y):\, x+y\leq u,\ \frac{x}{x+y}\leq v\}} f_{X,Y}(x,y)\,dx\,dy$

    Now, how do I differentiate this equation with respect to $u$ and $v$?
    The first step is to explicitly define the limits of integration in terms of $u$ and $v$.

  3. #3
    Senior Member Vinod's Avatar
    Joined
    Sep 2011
    From
    I live here
    Posts
    381
    Thanks
    7

    Re: Joint probability distribution of functions of random variables

    Quote Originally Posted by romsek View Post
    The first step is to explicitly define the limits of integration in terms of $u$ and $v$.
    Hello,
    The joint PDF of X and Y with parameters $(\alpha,\lambda)$ and $(\beta,\lambda)$ respectively is $\frac{\lambda^{\alpha+\beta} e^{-\lambda (x+y)} x^{\alpha-1}y^{\beta-1}}{\Gamma(\alpha)\Gamma(\beta)}$. We can express $x=uv$ and $y=u(1-v)$. Now what should the limits of integration be?

  4. #4
    Senior Member Vinod's Avatar
    Joined
    Sep 2011
    From
    I live here
    Posts
    381
    Thanks
    7

    Re: Joint probability distribution of functions of random variables

    Quote Originally Posted by Vinod View Post
    Hello,
    The joint PDF of X and Y with parameters $(\alpha,\lambda)$ and $(\beta,\lambda)$ respectively is $\frac{\lambda^{\alpha+\beta} e^{-\lambda (x+y)} x^{\alpha-1}y^{\beta-1}}{\Gamma(\alpha)\Gamma(\beta)}$. We can express $x=uv$ and $y=u(1-v)$. Now what should the limits of integration be?
    Now, $u=x+y \rightarrow du=dx+dy$ and $v=\frac{x}{x+y} \rightarrow dv=\frac{y}{(x+y)^2}dx-\frac{x}{(x+y)^2}dy$.

    So $f_{UV}(u,v)\, du\, dv=\frac{\lambda^{\alpha+\beta} e^{-\lambda(x+y)}x^{\alpha-1} y^{\beta-1}}{\Gamma(\alpha) \Gamma(\beta)}(dx+dy)\bigg(\frac{y}{(x+y)^2}dx -\frac{x}{(x+y)^2}dy\bigg)$

    So, finally, we get $\frac{\lambda^{\alpha+\beta} e^{-\lambda (x+y)} x^{\alpha-1} y^{\beta-1}(y-x)}{\Gamma(\alpha) \Gamma(\beta) (x+y)^2}dxdy$

    Now, how do we simplify this in terms of $u$ and $v$?

  5. #5
    MHF Contributor
    Joined
    Nov 2013
    From
    California
    Posts
    6,419
    Thanks
    2788

    Re: Joint probability distribution of functions of random variables

    Quote Originally Posted by romsek View Post
    the first step is to explicitly define the limits of integration in terms of $u$ and $v$
    You can work out the algebra, but what I'm seeing is that

    $F_{UV}(u,v) = \displaystyle \int_0^{uv} \int_0^{u(1-v)}~f_{XY}(x,y)~dy~dx$

    you want the PDF,

    $f_{UV}(u,v) = \dfrac{\partial^2}{\partial u\partial v}~F_{UV}(u,v)$

    So you have to differentiate through the integral.

    I confess I've never done partials through a double integral before. With a single variable

    $\dfrac{d}{du} \displaystyle \int_{a(u)}^{b(u)}~f(x)~dx = \dfrac{db}{du} f(b(u))-\dfrac{da}{du}f(a(u))$

    but I'm unsure how to extend this to partials of a double integral. Apparently something called Leibniz's rule applies here.

    I'm sure one of the math gurus here can explain how this is done.
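    The single-variable formula above is easy to check numerically. A minimal sketch (the integrand $f(x)=e^{-x}$ and limits $a(u)=u^2$, $b(u)=u^3$ are arbitrary choices for illustration, not from the thread):

```python
import math

def f(x):
    return math.exp(-x)      # integrand, no u-dependence

def a(u):
    return u * u             # lower limit a(u) = u^2, so da/du = 2u

def b(u):
    return u ** 3            # upper limit b(u) = u^3, so db/du = 3u^2

def integral(u, n=20_000):
    # Midpoint rule for the integral of f from a(u) to b(u)
    lo, hi = a(u), b(u)
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

u, h = 1.5, 1e-4
# Numerical derivative of the integral by central differences
numeric = (integral(u + h) - integral(u - h)) / (2 * h)
# Leibniz: (db/du) f(b(u)) - (da/du) f(a(u))
leibniz = 3 * u**2 * f(b(u)) - 2 * u * f(a(u))
print(numeric, leibniz)      # the two values should agree closely
```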

  6. #6
    MHF Contributor
    Joined
    Nov 2013
    From
    California
    Posts
    6,419
    Thanks
    2788

    Re: Joint probability distribution of functions of random variables

    Quote Originally Posted by romsek View Post
    You can work out the algebra, but what I'm seeing is that

    $F_{UV}(u,v) = \displaystyle \int_0^{uv} \int_0^{u(1-v)}~f_{XY}(x,y)~dy~dx$

    you want the PDF,

    $f_{UV}(u,v) = \dfrac{\partial^2}{\partial u\partial v}~F_{UV}(u,v)$

    So you have to differentiate through the integral.

    I confess I've never done partials through a double integral before. With a single variable

    $\dfrac{d}{du} \displaystyle \int_{a(u)}^{b(u)}~f(x)~dx = \dfrac{db}{du} f(b(u))-\dfrac{da}{du}f(a(u))$

    but I'm unsure how to extend this to partials of a double integral. Apparently something called Leibniz's rule applies here.

    I'm sure one of the math gurus here can explain how this is done.
    Mathematica gives this mess:

    $\Large f_{UV}(u,v) = \begin{cases}
    \frac{\left(\frac{1}{\lambda }\right)^{-\alpha -\beta } \lambda ^{-\alpha -\beta -1} e^{-\frac{u (v+1)}{\lambda }} \left(u v^2 e^{u/\lambda } \left(\frac{u v}{\lambda }\right)^{\alpha } \Gamma \left(\beta ,\frac{u-u v}{\lambda }\right)-u v^2 e^{\frac{2 u v}{\lambda }} \left(\frac{u-u v}{\lambda }\right)^{\beta } \Gamma \left(\alpha ,\frac{u v}{\lambda }\right)+\lambda \left(-e^{\frac{u v}{\lambda }}\right) \left(\frac{u v}{\lambda }\right)^{\alpha } \left(\frac{u-u v}{\lambda }\right)^{\beta }+2 \lambda v e^{\frac{u v}{\lambda }} \left(\frac{u v}{\lambda }\right)^{\alpha } \left(\frac{u-u v}{\lambda }\right)^{\beta }-(v-1) \Gamma (\beta ) e^{u/\lambda } (u v-\alpha \lambda ) \left(\frac{u v}{\lambda }\right)^{\alpha }+v \Gamma (\alpha ) e^{\frac{2 u v}{\lambda }} \left(\frac{u-u v}{\lambda }\right)^{\beta } (\beta \lambda +u (v-1))-u v e^{u/\lambda } \left(\frac{u v}{\lambda }\right)^{\alpha } \Gamma \left(\beta ,\frac{u-u v}{\lambda }\right)+\alpha \lambda e^{u/\lambda } \left(\frac{u v}{\lambda }\right)^{\alpha } \Gamma \left(\beta ,\frac{u-u v}{\lambda }\right)-\alpha \lambda v e^{u/\lambda } \left(\frac{u v}{\lambda }\right)^{\alpha } \Gamma \left(\beta ,\frac{u-u v}{\lambda }\right)+u v e^{\frac{2 u v}{\lambda }} \left(\frac{u-u v}{\lambda }\right)^{\beta } \Gamma \left(\alpha ,\frac{u v}{\lambda }\right)-\beta \lambda v e^{\frac{2 u v}{\lambda }} \left(\frac{u-u v}{\lambda }\right)^{\beta } \Gamma \left(\alpha ,\frac{u v}{\lambda }\right)\right)}{u (v-1) v \Gamma (\alpha ) \Gamma (\beta )} & v<1 \\
    0 & \text{otherwise}
    \end{cases}$

  7. #7
    Senior Member Vinod's Avatar
    Joined
    Sep 2011
    From
    I live here
    Posts
    381
    Thanks
    7

    Re: Joint probability distribution of functions of random variables

    Hello,
    In my post #4, I gave a wrong answer in terms of $x$ and $y$. In this post, I give the correct final answer for the joint density of U and V, in terms of $u$ and $v$:

    $f_{U,V}(u,v)=\frac{\lambda e^{-\lambda u}(\lambda u)^{\alpha+\beta-1}}{\Gamma(\alpha+\beta)}\cdot \frac{v^{\alpha-1}(1-v)^{\beta-1}\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}$

    Hence, U and V are independent, with U having a gamma distribution with parameters $(\alpha+\beta,\lambda)$ and V having probability density function

    $f_{V}(v)=\frac{\Gamma(\alpha+\beta)\, v^{\alpha-1}(1-v)^{\beta-1}}{\Gamma(\alpha)\Gamma(\beta)}, \;\;\; 0 < v < 1.$ This is called the beta density with parameters $(\alpha, \beta)$.

    This is the same answer we would get if we had used the Jacobian transformation.
    Last edited by Vinod; Jan 21st 2019 at 08:02 PM.
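    This factorization can also be checked by simulation. A rough sketch in plain Python (the parameter values are arbitrary choices of mine), comparing sample moments of U and V against the Gamma$(\alpha+\beta,\lambda)$ and Beta$(\alpha,\beta)$ means, with near-zero sample correlation as a (weak) indicator of independence:

```python
import random
import statistics

random.seed(0)
alpha, beta, lam, n = 2.0, 3.0, 1.5, 200_000

# random.gammavariate takes (shape, scale); with rate lam the scale is 1/lam
xs = [random.gammavariate(alpha, 1 / lam) for _ in range(n)]
ys = [random.gammavariate(beta, 1 / lam) for _ in range(n)]
us = [x + y for x, y in zip(xs, ys)]
vs = [x / (x + y) for x, y in zip(xs, ys)]

mean_u = statistics.fmean(us)   # Gamma(alpha+beta, lam) mean: (alpha+beta)/lam
mean_v = statistics.fmean(vs)   # Beta(alpha, beta) mean: alpha/(alpha+beta)

# Sample correlation of U and V; independence predicts roughly zero
cov = sum((u - mean_u) * (v - mean_v) for u, v in zip(us, vs)) / n
corr = cov / (statistics.pstdev(us) * statistics.pstdev(vs))

print(mean_u, (alpha + beta) / lam)    # ≈ 3.333
print(mean_v, alpha / (alpha + beta))  # ≈ 0.4
print(corr)                            # ≈ 0
```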

  8. #8
    Senior Member Vinod's Avatar
    Joined
    Sep 2011
    From
    I live here
    Posts
    381
    Thanks
    7

    Re: Joint probability distribution of functions of random variables

    Quote Originally Posted by Vinod View Post
    Now, $u=x+y \rightarrow du=dx+dy$ and $v=\frac{x}{x+y} \rightarrow dv=\frac{y}{(x+y)^2}dx-\frac{x}{(x+y)^2}dy$.

    So $f_{UV}(u,v)\, du\, dv=\frac{\lambda^{\alpha+\beta} e^{-\lambda(x+y)}x^{\alpha-1} y^{\beta-1}}{\Gamma(\alpha) \Gamma(\beta)}(dx+dy)\bigg(\frac{y}{(x+y)^2}dx -\frac{x}{(x+y)^2}dy\bigg)$

    So, finally, we get $\frac{\lambda^{\alpha+\beta} e^{-\lambda (x+y)} x^{\alpha-1} y^{\beta-1}(y-x)}{\Gamma(\alpha) \Gamma(\beta) (x+y)^2}dxdy$

    Now, how do we simplify this in terms of $u$ and $v$?
    Hello,

    Please read: finally, we get $\frac{\lambda^{\alpha +\beta} e^{-\lambda(x+y)} x^{\alpha-1} y^{\beta-1}(x+y)}{\Gamma(\alpha)\Gamma(\beta) (x+y)^2}\,dx\,dy$. Expanding the product of differentials in #4 as a wedge product gives $(dx+dy)\wedge\left(\frac{y}{(x+y)^2}dx-\frac{x}{(x+y)^2}dy\right)=-\frac{x+y}{(x+y)^2}\,dx\wedge dy$, so the correct factor is $(x+y)$, not $(y-x)$.

    The final correct answer in terms of $u$ and $v$ is given in post #7 of this thread.
    Last edited by Vinod; Jan 22nd 2019 at 03:37 AM.
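    The corrected factor $\frac{x+y}{(x+y)^2}=\frac{1}{x+y}$ can be cross-checked numerically as $|\det J|$ for the map $(x,y)\mapsto(u,v)$ (this uses the Jacobian determinant the thread set out to avoid, so it serves only as verification). A sketch with an arbitrary test point:

```python
def u_of(x, y):
    return x + y

def v_of(x, y):
    return x / (x + y)

def jac_det(x, y, h=1e-6):
    # Central finite differences for the 2x2 Jacobian of (u, v) in (x, y)
    du_dx = (u_of(x + h, y) - u_of(x - h, y)) / (2 * h)
    du_dy = (u_of(x, y + h) - u_of(x, y - h)) / (2 * h)
    dv_dx = (v_of(x + h, y) - v_of(x - h, y)) / (2 * h)
    dv_dy = (v_of(x, y + h) - v_of(x, y - h)) / (2 * h)
    return du_dx * dv_dy - du_dy * dv_dx

x, y = 1.3, 0.7                          # arbitrary test point, x + y = 2
print(abs(jac_det(x, y)), 1 / (x + y))   # both ≈ 0.5
```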
