Finding expectation through conditioning

A coin, having probability p of coming up heads, is successively flipped until at least one head and one tail have been flipped.

(a) Find the expected number of flips needed.

X=1, if the first flip results in heads

X=0, if the first flip results in tails

N: the total number of flips needed

$\displaystyle E[N]=E[N|X=1,h]P(X=1)+E[N|X=1,t]P(X=1)+E[N|X=0,h]P(X=0)+E[N|X=0,t]P(X=0)$

$\displaystyle E[N|X=1,h]=\frac{1}{p}+E[N]$

$\displaystyle E[N|X=1,t]=\frac{1}{p}+1$

$\displaystyle E[N|X=0,h]=\frac{1}{1-p}+1$

$\displaystyle E[N|X=0,t]=\frac{1}{1-p}+E[N]$

$\displaystyle E[N]=\left(\frac{1}{p}+E[N]\right)(p)+\left(\frac{1}{p}+1\right)(p)+\left(\frac{1}{1-p}+1\right)(1-p)+\left(\frac{1}{1-p}+E[N]\right)(1-p)$

$\displaystyle 1+pE[N]+1+p+\frac{1}{1-p}-\frac{p}{1-p}+1-p+\frac{1}{1-p}-\frac{p}{1-p}+E[N]-pE[N]$

$\displaystyle 3+\frac{2}{1-p}-\frac{2p}{1-p}+E[N]=E[N]$

At this point I'm obviously not able to solve for E[N], because the E[N] terms cancel each other out. What am I doing wrong? I've seriously been looking at this problem for four hours now.

There's another way of doing the problem that I just thought of:

$\displaystyle E[N]=E[N|X=1]P(X=1)+E[N|X=0]P(X=0)$

$\displaystyle E[N|X=1]=1+\frac{1}{1-p}$

$\displaystyle E[N|X=0]=1+\frac{1}{p}$

$\displaystyle E[N]=(1+\frac{1}{1-p})(p)+(1+\frac{1}{p})(1-p)$

$\displaystyle =p+\frac{p}{1-p}+1-p+\frac{1}{p}-1=\frac{p}{1-p}+\frac{1}{p}=\frac{p^{2}+1-p}{p(1-p)}$

But I don't have the solution to the problem, so I can't be sure this is correct, and I'm not very confident it is anyway.
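Without an answer key, one way to gain confidence is a quick Monte Carlo check of the closed form $\frac{p}{1-p}+\frac{1}{p}$ (a sketch in Python; the function names are my own, not from the thread):

```python
import random

def flips_until_both(p, rng):
    """Flip a coin with P(heads) = p until both a head and a tail
    have appeared; return the total number of flips used."""
    first = rng.random() < p          # outcome of the first flip (True = heads)
    n = 1                             # the first flip itself
    while (rng.random() < p) == first:
        n += 1                        # another flip matching the first outcome
    return n + 1                      # count the flip that finally differed

def expected_flips(p):
    """Closed form from the second approach: E[N] = p/(1-p) + 1/p."""
    return p / (1 - p) + 1 / p

rng = random.Random(0)                # fixed seed for reproducibility
p = 0.3
trials = 200_000
estimate = sum(flips_until_both(p, rng) for _ in range(trials)) / trials
# estimate should land close to expected_flips(0.3)
```

As an intuition check, for a fair coin (p = 0.5) the formula gives 3 flips on average: the first flip, plus an expected two more to see the opposite face.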

Re: Finding expectation through conditioning

So if your first flip is a H, then you're waiting for a T. And if your first flip is a T, then you're waiting for a H. In either case the number of additional flips needed follows a geometric distribution (but with different values for "p"). Your second approach seems correct to me.
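In symbols (a brief sketch, not part of the original reply): if each flip independently "succeeds" with probability q, the number of flips up to and including the first success is geometric with mean

$\displaystyle E[\text{Geom}(q)]=\sum_{k=1}^{\infty}kq(1-q)^{k-1}=\frac{1}{q},$

so after an initial H the expected additional wait for a T is $\frac{1}{1-p}$, and after an initial T the expected wait for an H is $\frac{1}{p}$; these are exactly the $1+\frac{1}{1-p}$ and $1+\frac{1}{p}$ used in the second approach.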

Re: Finding expectation through conditioning

Quote: Originally posted by **Random Variable**

So if your first flip is a H, then you're waiting for a T. And if your first flip is a T, then you're waiting for a H. In either case the number of additional flips needed follows a geometric distribution (but with different values for "p"). Your second approach seems correct to me.

Great, that's the same line of thought I was using for the problem.