# Thread: Showing that a max likelihood estimator is unique.

1. Showing that a max likelihood estimator is unique.

I'm given that $f(y;\lambda) = \lambda(1-y)^{\lambda-1}$

where $0 < y < 1$ and $\lambda > 0$

I've worked out that the max likelihood estimate of $\lambda$ based on $Y_{1},\dots,Y_{n}$ is

$\lambda_{ML} = - \frac{n}{\sum_{i=1}^{n} \log(1-y_{i})}$

Firstly, is this correct?

Secondly, how would I show that it's unique? And am I right in thinking that, if it is unique, then as $n$ gets bigger, $\lambda_{ML}$ tends to $\lambda$? If so, how do I show this?

2. Yes, you're right.

You show that it's unique by construction. The MLE is found by computing the log-likelihood and finding its stationary points (where the derivative with respect to $\lambda$ is zero); you also have to show that the second derivative is negative there, so the stationary point is a maximum of the likelihood. In this case there is only one stationary point, so it is the unique MLE.
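For this density, those steps work out as follows (a sketch, writing $\ell$ for the log-likelihood of the sample):

$\ell(\lambda) = \sum_{i=1}^{n} \log f(y_{i};\lambda) = n\log\lambda + (\lambda-1)\sum_{i=1}^{n}\log(1-y_{i})$

$\ell'(\lambda) = \frac{n}{\lambda} + \sum_{i=1}^{n}\log(1-y_{i}) = 0 \;\Rightarrow\; \lambda_{ML} = -\frac{n}{\sum_{i=1}^{n}\log(1-y_{i})}$

$\ell''(\lambda) = -\frac{n}{\lambda^{2}} < 0 \text{ for all } \lambda > 0,$

so the log-likelihood is strictly concave, the single stationary point is the unique global maximum, and the MLE is unique.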

How did you find the MLE?

As for the last question: an MLE is unbiased, and I know that it is asymptotically efficient (it has a smaller mean squared error than any other unbiased estimator), but I don't know how to prove it.

3. Why do you think this estimator is unbiased?
It may be asymptotically unbiased.
But it doesn't look unbiased for $\lambda$ to me.
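One way to see this (a sketch): if $Y$ has this density, then $P(Y > y) = (1-y)^{\lambda}$, so

$P(-\log(1-Y) > t) = P(Y > 1 - e^{-t}) = e^{-\lambda t},$

i.e. $-\log(1-Y)$ is exponential with rate $\lambda$. Hence $T = -\sum_{i=1}^{n}\log(1-Y_{i})$ has a Gamma$(n,\lambda)$ distribution, and for $n \ge 2$, $E[1/T] = \lambda/(n-1)$, so

$E[\lambda_{ML}] = E\!\left[\frac{n}{T}\right] = \frac{n\lambda}{n-1} \neq \lambda.$

The bias vanishes as $n \to \infty$, so the estimator is asymptotically unbiased but not unbiased.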

4. Asymptotically unbiased was what I meant, sorry.

5. That's what I thought.
MLEs are not always unbiased, especially with logarithms in the denominator.
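A quick Monte Carlo sketch makes the bias visible (the helper name `mean_mle` is just an illustration): for small $n$ the average of $\lambda_{ML}$ over many replicates sits noticeably above $\lambda$ (in fact $E[\lambda_{ML}] = n\lambda/(n-1)$), and it approaches $\lambda$ as $n$ grows.

```python
import math
import random

def mean_mle(lam, n, reps, seed=1):
    """Monte Carlo estimate of E[lambda_ML] for sample size n.

    Y is sampled by inverse CDF: Y = 1 - U**(1/lam), since
    F(y) = 1 - (1 - y)**lam. Then log(1 - Y) = log(U) / lam,
    so we can sum log(U)/lam directly.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        # sum of log(1 - y_i) for one sample of size n
        s = sum(math.log(rng.random()) / lam for _ in range(n))
        total += -n / s  # the MLE for this replicate
    return total / reps

lam = 2.0
print(mean_mle(lam, n=5, reps=50000))   # noticeably above 2 (theory: n*lam/(n-1) = 2.5)
print(mean_mle(lam, n=500, reps=2000))  # close to 2
```

The small-sample average matches $n\lambda/(n-1)$ rather than $\lambda$, while the large-sample average is essentially $\lambda$, which is exactly "asymptotically unbiased but biased in finite samples".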

6. Originally Posted by pedrosorio
I found the MLE the way you explained, though I hadn't done the second derivative. I've just done it, and it comes out negative. It never twigged that it's unique because there's only one stationary point. Thanks.