# Check Answer: Show that two Bernoulli random variables are independent

• Nov 10th 2010, 11:07 AM
Runty
Check Answer: Show that two Bernoulli random variables are independent
I've found a good answer to a particular question I was given, though I'd like a second opinion on whether or not my answer is solid (I might be "begging the question" in this).

Show that two Bernoulli random variables $X$ and $Y$ are independent if and only if $P(X=1,Y=1)=P(X=1)P(Y=1)$.

Two Bernoulli random variables are independent if and only if they are uncorrelated, i.e. if and only if $Cov(X,Y)=0$.
Let $p_x=P(X=1)$ and $p_y=P(Y=1)$, and write $p_{xy}=P(X=1,Y=1)$. Since $X$ and $Y$ take only the values $0$ and $1$, we have $E[XY]=p_{xy}$, $E[X]=p_x$ and $E[Y]=p_y$, so
$Cov(X,Y)=E[XY]-E[X]E[Y]=p_{xy}-p_xp_y$.
If $X$ and $Y$ are independent, then $P(X=1,Y=1)=P(X=1)P(Y=1)$, so
$Cov(X,Y)=p_{xy}-p_xp_y=P(X=1,Y=1)-P(X=1)P(Y=1)=0$.
If, on the other hand, $Cov(X,Y)=0$, then
$p_{xy}-p_xp_y=0\Rightarrow p_{xy}=p_xp_y$,
i.e. $P(X=1,Y=1)=P(X=1)P(Y=1)$, and therefore $X$ and $Y$ are independent.
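As a numerical sanity check (just a sketch; the joint pmf below is a made-up example that I chose so that $Cov(X,Y)=0$, not part of the question), the factorization does hold in all four cells, not only at $(1,1)$:

```python
# Sketch: for Bernoulli variables, Cov(X,Y) = 0 forces the full factorization
# P(X=i, Y=j) = P(X=i) P(Y=j). The joint pmf here is an illustrative example
# with P(X=1) = 0.4, P(Y=1) = 0.5 and P(X=1, Y=1) = 0.2 = 0.4 * 0.5.
joint = {(0, 0): 0.3, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.2}

# marginal pmfs of X and Y, obtained by summing the joint pmf
px = {i: sum(p for (a, b), p in joint.items() if a == i) for i in (0, 1)}
py = {j: sum(p for (a, b), p in joint.items() if b == j) for j in (0, 1)}

# for 0/1 variables, E[XY] - E[X]E[Y] reduces to p(1,1) - p_x(1) p_y(1)
cov = joint[(1, 1)] - px[1] * py[1]
assert abs(cov) < 1e-12

# once the (1,1) cell factors, every cell factors
for (i, j), p in joint.items():
    assert abs(p - px[i] * py[j]) < 1e-12
```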

Are there any mess-ups I might have made somewhere? I got this answer from this pdf file.
• Nov 10th 2010, 01:56 PM
matheagle
you would just have to show that the other 3 hold as well.

You have $P(X=1,Y=1)=P(X=1)P(Y=1)$

and you will need $P(X=1,Y=0)=P(X=1)P(Y=0)$
$P(X=0,Y=1)=P(X=0)P(Y=1)$
$P(X=0,Y=0)=P(X=0)P(Y=0)$

and they must sum to one via the total and the marginals, so that should help.
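for instance, the $(1,0)$ case already follows from the $(1,1)$ case and the marginal of $X$:

```latex
\begin{aligned}
P(X=1,Y=0) &= P(X=1) - P(X=1,Y=1)\\
           &= P(X=1) - P(X=1)P(Y=1)\\
           &= P(X=1)\bigl(1 - P(Y=1)\bigr)\\
           &= P(X=1)P(Y=0).
\end{aligned}
```

The $(0,1)$ case is symmetric, and the $(0,0)$ case then follows because the four joint probabilities sum to one.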
• Nov 11th 2010, 10:09 AM
Runty
Quote:

Originally Posted by matheagle
you would just have to show that the other 3 hold as well.

You have $P(X=1,Y=1)=P(X=1)P(Y=1)$

and you will need $P(X=1,Y=0)=P(X=1)P(Y=0)$
$P(X=0,Y=1)=P(X=0)P(Y=1)$
$P(X=0,Y=0)=P(X=0)P(Y=0)$

and they must sum to one via the total and the marginals, so that should help.

Or, as an alternative, one could show that $P(X=i,Y=j)=P(X=i)P(Y=j)$ for all $i,j\in\{0,1\}$ (a bit faster to write, but it amounts to the same thing). Note that I quoted the question word-for-word, and it says nothing about marginal pmfs or totals. I assume one would want to check that each marginal pmf sums to 1 for completeness, but I don't know whether that is necessary for the purposes of the question.

See, I got my answer from this link here, and the proof it gave looks very solid. Is this source missing some necessary parts?
• Nov 11th 2010, 10:11 AM
Runty
EDIT: This was a double post, oops.

But as long as it's here, I figured out what you mean by that "summing" part. Here's what I got from PhysicsForum.

$Cov(X,Y)= E(XY)-E(X)E(Y)$
$= \left(0 \cdot 0 \cdot p(0,0) + 0 \cdot 1 \cdot p(0,1) + 1 \cdot 0 \cdot p(1,0) + 1 \cdot 1 \cdot p(1,1)\right) - \left(0 \cdot p_x(0) + 1 \cdot p_x(1)\right) \left(0 \cdot p_y(0) + 1 \cdot p_y(1)\right)$
$= p(1,1) - p_x(1)p_y(1)$
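To double-check that expansion, here is a quick enumeration sketch (the joint pmf values are placeholders I picked; any valid Bernoulli joint pmf would do):

```python
# Direct enumeration of Cov(X,Y) = E[XY] - E[X]E[Y] for 0/1 variables,
# mirroring the term-by-term expansion. The joint pmf is a placeholder example.
p = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.15, (1, 1): 0.35}

E_XY = sum(i * j * p[(i, j)] for i in (0, 1) for j in (0, 1))
E_X = sum(i * p[(i, j)] for i in (0, 1) for j in (0, 1))
E_Y = sum(j * p[(i, j)] for i in (0, 1) for j in (0, 1))
cov = E_XY - E_X * E_Y

# only the i = j = 1 term survives in E[XY], so the covariance collapses to
# p(1,1) - p_x(1) p_y(1)
px1 = p[(1, 0)] + p[(1, 1)]
py1 = p[(0, 1)] + p[(1, 1)]
assert abs(cov - (p[(1, 1)] - px1 * py1)) < 1e-12
```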

Using this, I can feed the result into the argument I gave earlier and complete the proof.

If there's anything I've missed, let me know.