- Jan 19th 2011, 12:16 PM, reflex: Bernoulli parameter expectation
I'm looking at this page: Bernoulli Distribution -- from Wolfram MathWorld

I think I'm missing something simple, but how do you get from equation 24 to 25?

Thanks!

- Jan 19th 2011, 12:48 PM, emakarov
Yes, it looks a little... nonobvious.

The easiest thing is to apply the binomial theorem immediately: $\displaystyle p\sum_{n=0}^N{N\choose n}p^n(1-p)^{N-n}=p(p+1-p)^N=p$. Maybe they factored out $\displaystyle (1-p)^N$ first: $\displaystyle \sum_{n=0}^N{N\choose n}p^n(1-p)^{N-n}=(1-p)^N\sum_{n=0}^N{N\choose n}\left(\frac{p}{1-p}\right)^n=(1-p)^N\left(1+\frac{p}{1-p}\right)^N=(1-p)^N\frac{1}{(1-p)^N}=1$.

- Jan 19th 2011, 04:37 PM, matheagle
Yes, just pull out the $p$; the rest sums to one.
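A quick numerical sanity check of the identity used above (not part of the original thread; the values of $p$ and $N$ are arbitrary): the binomial sum equals 1, so multiplying it by $p$ recovers $p$.

```python
from math import comb

# Check that sum_{n=0}^{N} C(N,n) p^n (1-p)^(N-n) = 1, i.e. the binomial
# expansion of (p + (1-p))^N, so p times the sum is just p.
p, N = 0.3, 10
total = sum(comb(N, n) * p**n * (1 - p) ** (N - n) for n in range(N + 1))
print(total)      # essentially 1 (up to floating-point rounding)
print(p * total)  # essentially p = 0.3
```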

- Jan 21st 2011, 11:25 AM, reflex
Thanks!

Since it's an expectation of $\hat p$, shouldn't the first $p$ (or all the $p$'s) be a $\hat p$ term?

- Jan 21st 2011, 11:44 AM, matheagle
$p$ is a constant, usually unknown; that's why we estimate it with a random variable, $\hat p$.

$\displaystyle E(\hat p)=p$, and we like that $\hat p$ is unbiased for $p$.
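Unbiasedness can also be seen by simulation (a hypothetical illustration, not from the thread; the sample size, true $p$, and trial count are arbitrary): averaging $\hat p = n/N$ over many repeated samples should come out close to the true $p$.

```python
import random

# Simulate many samples of N Bernoulli(p) trials, estimate p each time
# with p_hat = successes / N, and average the estimates; since
# E(p_hat) = p, the average should be near the true p.
random.seed(0)
p, N, trials = 0.3, 50, 20000
p_hats = []
for _ in range(trials):
    successes = sum(1 for _ in range(N) if random.random() < p)
    p_hats.append(successes / N)
print(sum(p_hats) / trials)  # close to the true p = 0.3
```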