Hi,

I'm a bit perplexed by this and I wonder if someone can clarify things at all.

I understand (from my textbook) that the maximum likelihood estimator for the variance of a normal distribution is

$\displaystyle \hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (X_{i} - \bar{X})^2 $

But this is biased. Now that's all well and good, but I had understood that ML estimators are unbiased, yet plainly that is not always true. When is it true?
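For what it's worth, I ran a quick simulation to convince myself the bias is real (numpy; the seed, sample size, and true variance are just my own choices, not from the textbook):

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma2 = 5, 4.0          # small n makes the bias easy to see
trials = 200_000

# draw many samples of size n from N(0, sigma2)
x = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))

# the 1/n ("ML") variance estimate for each sample
mle = ((x - x.mean(axis=1, keepdims=True)) ** 2).mean(axis=1)

# averages to about (n-1)/n * sigma2 = 3.2, not the true 4.0
print(mle.mean())
```

So on average the estimator undershoots by the factor $(n-1)/n$, which is exactly the factor the unbiased sample variance corrects for.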

Or to put things another way: Why is the (unbiased) sample variance not the ML solution to the variance estimation problem?

Thanks in advance to anyone who can clear this up for me. MD