# Math Help - Using joint probability mass functions (multiple parts)

1. ## Using joint probability mass functions (multiple parts)

This question has a lot to it, and it appears the parts must be done in order (each part builds on the one before, so you can't skip ahead without completing the prior part). The questions are listed below word-for-word.

Let $X$ and $Y$ have the following pmf:
$P(X=i,Y=j)=\frac{\alpha}{(1+i+j)!}, i\geq 0, j\geq 0$
for some $\alpha>0$.

1. Explain without any calculation why $X$ and $Y$ have the same marginal pmf. That is, why $P(X=i)=P(Y=i)$.
2. Let $S=X+Y$. Show that $P(S=k)=\frac{\alpha}{k!}$ for all $k\geq0$. Note that $X$ and $Y$ need not be independent (as will be discussed below).
3. Conclude the value of $\alpha$ and recognize the distribution of $S$. What is the parameter of this distribution?
4. Compute $P(X=0)$. Are $X$ and $Y$ independent?
5. Find $E(S)$, and conclude $E(X)$. HINT: To find $E(X)$ from $E(S)$ you may use part (a) that says $X$ and $Y$ have same distributions and therefore same...
6. Compute $P(X=Y)$. HINT: You can write it as
$P(X=Y)=\sum_{i=0}^\infty P(X=i,Y=i)$,
and think of the infinite expansion of $(e^x-e^{-x})/2$.
7. Conclude $P(X>Y)$. HINT: You may use a symmetry argument.

This is a lot to do, but that's the question as a whole, and I'd rather not make multiple topics pertaining to the same joint pmf.

2. 1.
The joint pmf depends on $i$ and $j$ only through the sum $i+j$, so it is symmetric in its two arguments. Summing out either variable therefore gives the same marginal:
$P(X=i)=\sum_{j=0}^\infty P(X=i,Y=j)=\sum_{j=0}^\infty P(X=j,Y=i)=P(Y=i)$
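For anyone who wants to double-check the symmetry numerically, here is a quick Python sketch (the grid size and the value $\alpha=e^{-1}$, found in part 3, are just convenient choices; the symmetry holds for any $\alpha>0$):

```python
import math

alpha = math.exp(-1)  # value found in part 3; any alpha > 0 works here
N = 40  # truncation point; terms decay factorially, so the tail is negligible

# Marginal of X at i sums the joint pmf over j; marginal of Y swaps the roles
marg_x = [sum(alpha / math.factorial(1 + i + j) for j in range(N)) for i in range(6)]
marg_y = [sum(alpha / math.factorial(1 + i + j) for i in range(N)) for j in range(6)]
print(marg_x == marg_y)  # True: the joint pmf depends only on i + j
```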

2.
Letting $S=X+Y$, to calculate $P(S=k)$ we note:

$k=0 \implies X=0,Y=0 \text{ so } P(S=0)=P(X=0,Y=0)=\frac{\alpha}{1!}=\frac{\alpha}{0!}$
using the fact that $0!=1!=1$.

$k=1 \implies X=0,Y=1 \text{ or } X=1,Y=0 \text{ so } P(S=1)=2 \cdot P(X=1,Y=0)=2\cdot \frac{\alpha}{(1+1+0)!} = \frac{2\alpha}{2!}=\frac{\alpha}{1!}$

since $P(X=1,Y=0)=P(X=0,Y=1)$.

$k=2 \implies (X,Y)\in\{(0,2),(2,0),(1,1)\} \text{ so } P(S=2)= 2 \cdot \frac{\alpha}{(1+2+0)!}+\frac{\alpha}{(1+1+1)!}=\frac{3\alpha}{3!}=\frac{\alpha}{2!}$

In general there are $k+1$ pairs $(i,j)$ with $i+j=k$, each with probability $\frac{\alpha}{(1+k)!}$, so $P(S=k)=(k+1)\cdot\frac{\alpha}{(k+1)!}=\frac{\alpha}{k!}$ for all $k\geq 0$, which is the given formula.
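A quick numerical check of this counting argument (any $\alpha>0$ works, since it cancels in the comparison):

```python
import math

alpha = 1.0  # alpha cancels in the comparison, so any positive value works

# For each k, sum the joint pmf over the k+1 pairs (i, j) with i + j = k
for k in range(10):
    total = sum(alpha / math.factorial(1 + i + (k - i)) for i in range(k + 1))
    assert abs(total - alpha / math.factorial(k)) < 1e-12
print("P(S=k) = alpha/k! holds for k = 0..9")
```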

3.
For any pmf, the sum over all possible values equals 1, so we can solve for $\alpha$:
$1=\sum_{k=0}^\infty \frac{\alpha}{k!}=\alpha \cdot e$
Therefore $\alpha=e^{-1}$, so $S$ is Poisson distributed with parameter $\lambda=1$.
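To double-check that $\alpha=e^{-1}$ really normalizes the joint pmf, you can sum it over a large truncated grid (the grid size 50 is an arbitrary choice; the factorial tail beyond it is negligible):

```python
import math

alpha = math.exp(-1)

# Sum the joint pmf over a truncated grid; the factorial tail is negligible
total = sum(alpha / math.factorial(1 + i + j)
            for i in range(50) for j in range(50))
print(total)  # ~ 1.0
```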

4.
$P(X=0)=\sum_{y=0}^\infty \frac{e^{-1}}{(1+y)!}=e^{-1}\sum_{m=1}^\infty \frac{1}{m!}=e^{-1}(e-1)=1-e^{-1}$

By the definition of independence, we would need
$P(X=0,Y=0)=P(X=0)P(Y=0)$
LHS: $P(X=0,Y=0)=P(S=0)=e^{-1}\approx 0.3679$
RHS: $P(X=0)P(Y=0)=(1-e^{-1})^2\approx 0.3996 \neq \text{LHS}$, where $P(Y=0)=P(X=0)$ since $X$ and $Y$ have the same distribution.
Hence they are not independent.

5.
Since $S \sim \text{Poi}(1)$, its expected value is $\lambda=1$.

Using the fact from part 1 that $X$ and $Y$ have the same distribution and hence the same expected value:
$E(S)=1=E(X)+E(Y)=2\cdot E(X)$
Therefore $E(X)=0.5$.
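Computing $E(X)$ directly from the joint pmf gives the same answer (the truncation at $N=60$ is arbitrary; the factorial tail is negligible):

```python
import math

alpha = math.exp(-1)
N = 60  # truncation point; the factorial tail is negligible

# E(X) = sum over (i, j) of i * P(X=i, Y=j)
e_x = sum(i * alpha / math.factorial(1 + i + j)
          for i in range(N) for j in range(N))
print(e_x)  # ~ 0.5
```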

3. Originally Posted by Robb
Very helpful stuff, thanks. However, I'm still trying to solve the last two. I must be missing something right under my nose.

I've got some stuff here for #6, but I'm wondering if I made a mistake somewhere. The hint about the "infinite expansion of $\frac{e^x-e^{-x}}{2}$" is quite strange.
$P(X=Y)=\sum_{i=0}^\infty P(X=i,Y=i)=\sum_{i=0}^\infty \frac{\alpha}{(1+i+i)!} =\sum_{i=0}^\infty \frac{e^{-1}}{(1+2i)!}=e^{-1} \sum_{i=1}^\infty \frac{1}{2!i!}=e^{-1} \left( \frac{e-1}{2} \right) =\frac{1-e^{-1}}{2}$
In case I made a mistake somewhere, if someone could point it out, that'd be great.

As for #7, I'm afraid I've drawn a blank on what to do there. Would I try something like $P(X=i+1,Y=i)$ and see if that works?

4. Originally Posted by Runty
Very helpful stuff, thanks. However, I'm still trying to solve the last two. I must be missing something right under my nose.

I've got some stuff here for #6, but I'm wondering if I made a mistake somewhere. The hint about the "infinite expansion of $\frac{e^x-e^{-x}}{2}$" is quite strange.
This is from the hyperbolic sine function: $\sum_{n=0}^{\infty}\frac{x^{2n+1}}{(1+2n)!}=\frac{e^x-e^{-x}}{2}$

$\sum_{i=0}^\infty \frac{e^{-1}}{(1+2i)!}=e^{-1} \sum_{i=1}^\infty \frac{1}{2!i!}$
Be careful, as $(2i)!\neq2!i!$

So using this hint (with $x=1$), $P(X=Y)=\sum_{i=0}^\infty P(X=i,Y=i)=\sum_{i=0}^\infty \frac{\alpha}{(1+2i)!} =\sum_{i=0}^\infty \frac{e^{-1}}{(1+2i)!}=e^{-1} \cdot \sum_{i=0}^\infty \frac{1}{(1+2i)!}=e^{-1}\left(\frac{e-e^{-1}}{2}\right)=\frac{1-e^{-2}}{2}\approx 0.43233236$
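A quick numerical check that the diagonal sum matches the closed form $(1-e^{-2})/2$:

```python
import math

alpha = math.exp(-1)

# Sum the joint pmf along the diagonal i = j
p_eq = sum(alpha / math.factorial(1 + 2 * i) for i in range(40))
closed = (1 - math.exp(-2)) / 2  # e^{-1} * sinh(1)
print(p_eq, closed)  # both ~ 0.43233236
```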

As for #7, I'm afraid I've drawn a blank on what to do there. Would I try something like $P(X=i+1,Y=i)$ and see if that works?
That won't work, as you'd need $\sum_{i=0}^{\infty}\sum_{c=1}^{\infty}P(X=i+c,Y=i)$

Using the hint with symmetry,
$P(X>Y)=0.5\,(1-P(X=Y))$ since if $X\neq Y$ then either $X>Y$ or $X<Y$, and by symmetry each occurs with the same probability.
Therefore $P(X>Y)=0.5\,(1-0.43233236)=0.283834$
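The symmetry answer can also be checked by summing the joint pmf directly over the region $i>j$ ($N=60$ is an arbitrary truncation):

```python
import math

alpha = math.exp(-1)
N = 60  # truncation point; the factorial tail is negligible

# Sum the joint pmf over all pairs with i > j
p_gt = sum(alpha / math.factorial(1 + i + j)
           for i in range(N) for j in range(i))
print(p_gt)  # ~ 0.283834
```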