# Thread: maximum likelihood estimator biased not unbiased


a) derive the maximum likelihood estimator of the geometric distribution $f(x;\theta)=\theta(1-\theta)^{x-1},\quad x=1,2,3,\ldots$

let $x_{1},\ldots,x_{n}$ represent the numbers of attacks.

b) determine the Cramér-Rao lower bound for an unbiased estimator of $\theta$

I have an answer for part a) which gives the MLE as $\hat{\theta}=\frac{n}{\sum x_i}=\frac{1}{\overline{x}}$, and this would work for part b) if it is unbiased. For it to be unbiased the requirement is either $\mathrm{MSE}(\hat{\theta})=\mathrm{Var}(\hat{\theta})$ or $\mathrm{Bias}(\hat{\theta})=0$, i.e. $E(\hat{\theta})=\theta$. Heuristically, substituting $E(X)=\frac{1}{\theta}$ for $\overline{x}$ gives
$\frac{1}{1/\theta}-\theta=0,$
though strictly speaking $E\!\left(\frac{1}{\overline{X}}\right)\neq\frac{1}{E(\overline{X})}$ in general.
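To sanity-check that unbiasedness claim numerically, here is a quick Monte Carlo sketch of my own (not from the question or the official solution); the value $\theta=0.3$ and sample size $n=10$ are arbitrary choices for the experiment:

```python
# My own quick Monte Carlo check (not from the official solution):
# is E[1/xbar] really equal to theta for this geometric pmf?
import random

random.seed(0)
theta = 0.3     # assumed true parameter for the experiment
n = 10          # small sample size, where any bias shows up most clearly
trials = 20000

def draw_geometric(theta):
    """Number of attempts up to and including the first success: 1, 2, 3, ..."""
    x = 1
    while random.random() > theta:
        x += 1
    return x

total = 0.0
for _ in range(trials):
    xbar = sum(draw_geometric(theta) for _ in range(n)) / n
    total += 1.0 / xbar

estimate = total / trials
print(estimate)  # comes out a little above theta = 0.3, so 1/xbar is biased upward
```

This suggests $E(1/\overline{X})$ sits slightly above $\theta$ (as Jensen's inequality would predict for the convex map $u\mapsto 1/u$), which fits the thread title.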

However, I cannot see how this estimator $\frac{1}{\overline{x}}$ was obtained. My working is as follows, and I obtained something completely different, while the official working gave the answer above. Any offerings as to how they did it?

$\ell(\theta)=\log\left[\theta^{n}(1-\theta)^{\sum x_{i}-n}\right]$
$=\left(\sum x_{i}-n\right)\log(1-\theta)+n\log\theta$
$\ell'(\theta)=\frac{\sum x_{i}-n}{1-\theta}+\frac{n}{\theta}=0$
$\Rightarrow\ \theta\left(\sum x_{i}-n\right)+n(1-\theta)=0$
$\Rightarrow\ \theta\left(\sum x_{i}-n\right)+n-n\theta=0$
$\Rightarrow\ \theta\left(\sum x_{i}-2n\right)=-n$
$\Rightarrow\ \theta=\frac{-n}{\sum x_{i}-2n}=\frac{n}{2n-\sum x_{i}}$

which is clearly different from the official answer (and biased).
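Plugging one simulated geometric sample into both formulas makes the difference concrete; this is my own check (not part of the official working), with $\theta=0.3$ and $n=50$ chosen arbitrarily:

```python
# My own check (not from the official working): plug one simulated geometric
# sample into my formula n/(2n - sum x_i) and the official n/(sum x_i).
import random

random.seed(1)
theta = 0.3     # assumed true parameter
n = 50

def draw_geometric(theta):
    """Number of attempts up to and including the first success: 1, 2, 3, ..."""
    x = 1
    while random.random() > theta:
        x += 1
    return x

xs = [draw_geometric(theta) for _ in range(n)]
s = sum(xs)

mine = n / (2 * n - s)   # since E[sum x_i] = n/theta > 2n for theta < 0.5,
                         # this is usually negative -- not a valid probability
official = n / s         # stays close to theta
print(mine, official)
```

My formula typically returns a negative value here, which is another sign something went wrong in the differentiation step above.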
