
Math Help - Using joint probability mass functions (multiple parts)

  1. #1
    Member
    Joined
    Jan 2010
    Posts
    232

    Using joint probability mass functions (multiple parts)

    This question has several parts, and I'm guessing they have to be done in order (if so, you can't skip ahead to a part without completing the prior one). The questions are listed below word-for-word.

    Let X and Y have the following pmf:
    P(X=i,Y=j)=\frac{\alpha}{(1+i+j)!}, i\geq 0, j\geq 0
    for some \alpha>0.

    1. Explain without any calculation why X and Y have the same marginal pmf. That is, why P(X=i)=P(Y=i).
    2. Let S=X+Y. Show that P(S=k)=\frac{\alpha}{k!} for all k\geq0. Note that X and Y need not be independent (as will be discussed below).
    3. Conclude the value of \alpha and recognize the distribution of S. What is the parameter of this distribution?
    4. Compute P(X=0). Are X and Y independent?
    5. Find E(S), and conclude E(X). HINT: To find E(X) from E(S) you may use part (a) that says X and Y have same distributions and therefore same...
    6. Compute P(X=Y). HINT: You can write it as
      P(X=Y)=\sum_{i=0}^\infty P(X=i,Y=i),
      and think of the infinite expansion of (e^x-e^{-x})/2.
    7. Conclude P(X>Y). HINT: You may use a symmetry argument.

    This is a lot to do, but that's the question as a whole, and I'd rather not make multiple topics pertaining to the same joint pmf.
    Last edited by Runty; November 11th 2010 at 09:25 AM. Reason: Slight fix to question

  2. #2
    Member
    Joined
    Mar 2009
    Posts
    133
    1.
    It is clear from the form of the joint pmf that the marginals are the same: since P(X=i,Y=j) depends only on i+j, we have P(X=i,Y=j)=P(X=j,Y=i), and so
     P_X (i)=\sum_{j=0}^\infty P(X=i,Y=j) = \sum_{j=0}^\infty P(X=j,Y=i) = P_Y (i)

    2.
    Letting S=X+Y, to calculate P(S=k) we work through the first few cases.
    For:
    k=0 \implies X=0,Y=0 \text{  so  } P(S=0)=P(X=0,Y=0)=\frac{\alpha}{1!}=\frac{\alpha}{0!}
    since by the definition of factorials, 0! = 1! = 1.


    k=1 \implies X=0,Y=1;X=1,Y=0 \text{  so  } P(S=1)=2 \cdot P(X=1,Y=0)=2\cdot \frac{\alpha}{(1+1+0)!} = \frac{2\alpha}{1\cdot 2}=\frac{\alpha}{1!}
    Since  P(X=1,Y=0)=P(X=0,Y=1)

    k=2 \implies X=0,Y=2;X=2,Y=0;X=1,Y=1 \text{  so  } P(S=2)= 2 \cdot \frac{\alpha}{(1+2+0)!}+\frac{\alpha}{(1+1+1)!}=\frac{3\alpha}{3!}=\frac{\alpha}{2!}

    In general, for each k there are k+1 pairs (i,j) with i+j=k, and each contributes \frac{\alpha}{(k+1)!}, so P(S=k)=(k+1)\cdot\frac{\alpha}{(k+1)!}=\frac{\alpha}{k!}, which is the given formula.
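
    The counting argument above can be checked numerically. This is a sketch with hypothetical helper names (joint_pmf, p_sum), using the value \alpha=e^{-1} found in part 3 (the identity itself holds for any \alpha>0):

    ```python
    from math import factorial, exp, isclose

    alpha = exp(-1)  # value from part 3; the identity holds for any alpha > 0

    def joint_pmf(i, j):
        """P(X=i, Y=j) = alpha / (1 + i + j)!"""
        return alpha / factorial(1 + i + j)

    def p_sum(k):
        """P(S=k) by brute-force summation over the k+1 pairs with i + j = k."""
        return sum(joint_pmf(i, k - i) for i in range(k + 1))

    # Each of the k+1 terms equals alpha/(k+1)!, so the sum is alpha/k!.
    for k in range(10):
        assert isclose(p_sum(k), alpha / factorial(k))
    ```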


    3.
    For any pmf, we know that the sum over all possible values equals 1, so we can solve for \alpha:
    1=\sum_{k=0}^\infty \frac{\alpha}{k!}=\alpha \cdot e^1
    Therefore \alpha=e^{-1}, so S is Poisson distributed with parameter \lambda=1.


    4.
    P(X=0)=\sum_{y=0}^\infty \frac{e^{-1}}{(1+y)!}=e^{-1}\sum_{y=1}^\infty \frac{1}{y!}=e^{-1}(e^1-1)=1-e^{-1}

    By the definition of independence;
    P(X=0,Y=0)=P(X=0)P(Y=0)
    LHS: P(X=0,Y=0)=P(S=0)=e^{-1}\approx 0.3679
    RHS: P(X=0)P(Y=0)=(1-e^{-1})^2\approx 0.3996 \neq \text{LHS}, using P(Y=0)=P(X=0) since X and Y have the same distribution.
    Hence they are not independent.
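
    A quick numeric sketch of this check (hypothetical variable names, truncating the fast-converging series at 100 terms):

    ```python
    from math import factorial, exp, isclose

    alpha = exp(-1)  # from part 3

    # P(X=0) = sum_j alpha/(1+j)!; the tail decays factorially fast.
    p_x0 = sum(alpha / factorial(1 + j) for j in range(100))
    assert isclose(p_x0, 1 - exp(-1))

    # Independence would require P(X=0, Y=0) = P(X=0) * P(Y=0).
    p_joint_00 = alpha / factorial(1)  # P(X=0, Y=0) = alpha/1! = e^{-1}
    p_product = p_x0 * p_x0            # P(Y=0) = P(X=0) by part 1
    assert abs(p_joint_00 - p_product) > 0.03  # 0.3679 vs 0.3996: not independent
    ```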

    5.
    Since S \sim \text{Poi}(1), its expected value is \lambda=1.

    Using the fact from part 1 that X and Y have the same distribution, and hence the same expected value:
    E(S)=1=E(X)+E(Y)=2\cdot E(X)
    Therefore, E(X)=0.5
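
    The two expectations can also be confirmed directly from the joint pmf. A sketch with a truncated double sum (the factorial decay makes the tail negligible at N = 60; names here are hypothetical):

    ```python
    from math import factorial, exp, isclose

    alpha = exp(-1)
    N = 60  # truncation point; terms beyond this are astronomically small

    def joint_pmf(i, j):
        return alpha / factorial(1 + i + j)

    E_S = sum((i + j) * joint_pmf(i, j) for i in range(N) for j in range(N))
    E_X = sum(i * joint_pmf(i, j) for i in range(N) for j in range(N))

    assert isclose(E_S, 1.0)  # E(S) = lambda = 1 for Poisson(1)
    assert isclose(E_X, 0.5)  # E(X) = E(S)/2 by the symmetry of part 1
    ```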

  3. #3
    Member
    Joined
    Jan 2010
    Posts
    232
    Quote Originally Posted by Robb View Post
    [Robb's full solution to parts 1–5, quoted above]
    Very helpful stuff, thanks. However, I'm still trying to solve the last two. I must be missing something right under my nose.

    I've got some stuff here for #6, but I'm wondering if I made a mistake somewhere. The hint on the "infinite expansion of \frac{e^x-e^{-x}}{2}" is quite strange.
    P(X=Y)=\sum_{i=0}^\infty P(X=i,Y=i)=\sum_{i=0}^\infty \frac{\alpha}{(1+i+i)!} =\sum_{i=0}^\infty \frac{e^{-1}}{(1+2i)!}=e^{-1} \sum_{i=1}^\infty \frac{1}{2!i!}=e^{-1} \left( \frac{e-1}{2} \right) =\frac{1-e^{-1}}{2}
    In case I made a mistake somewhere, if someone could point it out, that'd be great.

    As for #7, I'm afraid I've drawn a blank on what to do there. Would I try something like P(X=i+1,Y=i) and see if that works?

  4. #4
    Member
    Joined
    Mar 2009
    Posts
    133
    Quote Originally Posted by Runty View Post
    Very helpful stuff, thanks. However, I'm still trying to solve the last two. I must be missing something right under my nose.

    I've got some stuff here for #6, but I'm wondering if I made a mistake somewhere. The hint on the "infinite expansion of \frac{e^x-e^{-x}}{2}" is quite strange.
    This is from the hyperbolic sine function: \sum_{n=0}^{\infty}\frac{x^{2n+1}}{(1+2n)!}=\frac{e^x-e^{-x}}{2}

     \sum_{i=0}^\infty \frac{e^{-1}}{(1+2i)!}=e^{-1} \sum_{i=1}^\infty \frac{1}{2!i!}
    Be careful, as (2i)!\neq2!i!

    So using this hint, P(X=Y)=\sum_{i=0}^\infty P(X=i,Y=i)=\sum_{i=0}^\infty \frac{\alpha}{(1+i+i)!} =\sum_{i=0}^\infty \frac{e^{-1}}{(1+2i)!}=e^{-1} \cdot \sum_{i=0}^\infty \frac{1}{(1+2i)!}=e^{-1}\left(\frac{e^1-e^{-1}}{2}\right)=0.43233236
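
    The sinh-based value can be verified numerically. A sketch (hypothetical names; the series is truncated at 50 terms, far more than needed):

    ```python
    from math import factorial, exp, sinh, isclose

    alpha = exp(-1)

    # P(X=Y) = sum_i alpha/(1+2i)! -- the odd-factorial series from sinh.
    p_eq = sum(alpha / factorial(1 + 2 * i) for i in range(50))

    assert isclose(p_eq, exp(-1) * sinh(1))  # e^{-1} (e - e^{-1}) / 2
    assert isclose(p_eq, (1 - exp(-2)) / 2)  # simplified closed form
    ```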

    As for #7, I'm afraid I've drawn a blank on what to do there. Would I try something like P(X=i+1,Y=i) and see if that works?
    That won't work, as you'd need \sum_{i=0}^{\infty}\sum_{c=1}^{\infty}P(X=i+c,Y=i)

    Using the hint with symmetry,
    P(X>Y)=0.5(1-P(X=Y)), since if X\neq Y then either X>Y or X<Y, and by symmetry each happens 50% of the time.
    Therefore; P(X>Y)=0.5(1-0.43233236)=0.283834
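
    The symmetry argument can be double-checked by brute force over a truncated grid (hypothetical names; N = 60 makes the neglected tail negligible):

    ```python
    from math import factorial, exp, isclose

    alpha = exp(-1)
    N = 60

    p_gt = sum(alpha / factorial(1 + i + j)
               for i in range(N) for j in range(N) if i > j)
    p_lt = sum(alpha / factorial(1 + i + j)
               for i in range(N) for j in range(N) if i < j)
    p_eq = sum(alpha / factorial(1 + 2 * i) for i in range(N))

    assert isclose(p_gt, p_lt)              # symmetry of the joint pmf
    assert isclose(p_gt, 0.5 * (1 - p_eq))  # = 0.5 * (1 - P(X=Y))
    assert isclose(p_gt + p_lt + p_eq, 1.0) # total probability
    ```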
    Last edited by Robb; November 16th 2010 at 06:01 AM.
