Showing that a max likelihood estimator is unique.

• Aug 22nd 2009, 05:42 AM
Zenter
Showing that a max likelihood estimator is unique.
I'm given that $\displaystyle f(y;\lambda) = \lambda(1-y)^{\lambda-1}$

where $\displaystyle 0 < y < 1$ and $\displaystyle \lambda > 0$

I've worked out that the max likelihood estimate of $\displaystyle \lambda$ based on $\displaystyle Y_{1},\dots,Y_{n}$ is

$\displaystyle \lambda_{ML} = - \frac{n}{\sum_{i=1}^{n} \log(1-y_{i})}$

Firstly, is this correct?

Secondly, how would I show that it's unique? Am I right in thinking that if it's unique, as n gets bigger, $\displaystyle \lambda_{ML}$ tends to $\displaystyle \lambda$? If so, how do I show this?
• Aug 22nd 2009, 01:26 PM
pedrosorio
Yes, you're right.

You show that it's unique by construction. The MLE is found by computing the log-likelihood and finding its stationary points (where the derivative with respect to $\displaystyle \lambda$ is zero). You also have to show that the second derivative is negative, which confirms that the stationary point is a maximum of the likelihood. In this case there is only one stationary point, so the MLE is unique.
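As a sketch of that construction, using the density from the original post: the log-likelihood for a sample $\displaystyle y_{1},\dots,y_{n}$ is

$\displaystyle \ell(\lambda) = \sum_{i=1}^{n}\log\left[\lambda(1-y_{i})^{\lambda-1}\right] = n\log\lambda + (\lambda-1)\sum_{i=1}^{n}\log(1-y_{i})$

Setting $\displaystyle \ell'(\lambda) = \frac{n}{\lambda} + \sum_{i=1}^{n}\log(1-y_{i}) = 0$ gives the single stationary point $\displaystyle \hat{\lambda} = -\frac{n}{\sum_{i=1}^{n}\log(1-y_{i})}$, and since $\displaystyle \ell''(\lambda) = -\frac{n}{\lambda^{2}} < 0$ for every $\displaystyle \lambda > 0$, the log-likelihood is strictly concave, so this stationary point is the unique global maximum.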

How did you find the MLE?

As for the last question, an MLE is unbiased, and I know that it is asymptotically more efficient (has a smaller mean squared error) than any other unbiased estimator, but I don't know how to prove it.
• Aug 22nd 2009, 01:56 PM
matheagle
Why do you think this estimator is unbiased?
It may be asymptotically unbiased.
But it doesn't look unbiased for $\displaystyle \lambda$ to me.
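A quick simulation supports this. The sketch below is illustrative: the "true" $\displaystyle \lambda$, the sample sizes, and all function names are assumptions for the demonstration, not from the thread. For small $\displaystyle n$ the estimator overshoots $\displaystyle \lambda$ on average (in fact $\displaystyle E[\lambda_{ML}] = \frac{n\lambda}{n-1}$, since $\displaystyle -\log(1-Y_{i})$ is exponential with rate $\displaystyle \lambda$), while for large $\displaystyle n$ the average is close to $\displaystyle \lambda$.

```python
import random
import math

random.seed(42)
true_lambda = 2.0  # assumed "true" parameter for the simulation

def sample_y(lam):
    # Inverse-CDF sampling: F(y) = 1 - (1 - y)^lam, so Y = 1 - U^(1/lam), U ~ Uniform(0,1)
    return 1.0 - random.random() ** (1.0 / lam)

def lambda_ml(ys):
    # The thread's estimator: -n / sum(log(1 - y_i))
    return -len(ys) / sum(math.log(1.0 - y) for y in ys)

def mean_estimate(n, reps=20000):
    # Average the MLE over many independent samples of size n
    return sum(lambda_ml([sample_y(true_lambda) for _ in range(n)])
               for _ in range(reps)) / reps

small_n_mean = mean_estimate(5)    # biased upward: E[lambda_ML] = n*lambda/(n-1) = 2.5 here
large_n_mean = mean_estimate(100)  # asymptotically unbiased: average near 2.0
```

So the estimator is biased for any fixed $\displaystyle n$ but asymptotically unbiased, which matches matheagle's point.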
• Aug 22nd 2009, 02:09 PM
pedrosorio
Asymptotically unbiased was what I meant, sorry.
• Aug 22nd 2009, 02:11 PM
matheagle
That's what I thought.
MLEs are not always unbiased, especially with logarithms in the denominator.
• Aug 23rd 2009, 03:53 AM
Zenter
Quote:

Originally Posted by pedrosorio
Yes, you're right.

You show that it's unique by construction. The MLE is found by computing the log-likelihood and finding its stationary points (where the derivative with respect to $\displaystyle \lambda$ is zero). You also have to show that the second derivative is negative, which confirms that the stationary point is a maximum of the likelihood. In this case there is only one stationary point, so the MLE is unique.

How did you find the MLE?

As for the last question, an MLE is unbiased, and I know that it is asymptotically more efficient (has a smaller mean squared error) than any other unbiased estimator, but I don't know how to prove it.

I found the MLE the way you explained. I hadn't done the second derivative, but I've just done it and it comes out negative :) It never twigged that it's unique because there's only one stationary point. Thanks.