
- Jan 4th 2012, 07:41 AM, Duke: Asymptotic Variance
What is the asymptotic variance of an estimator? There seem to be several definitions on the web, but I know the one I'm after involves the expectation of the second derivative of the log-likelihood.

- Jan 4th 2012, 10:01 PM, matheagle: Re: Asymptotic Variance
An estimator is, by definition, a statistic: a function of the data, i.e. of the sample

$\displaystyle X_1,\ldots, X_n$

so it's a random variable with a distribution of its own.

Which means it has a mean and a variance.

If its mean, i.e. its expectation, is equal to the unknown parameter (which is a constant),

then that estimator is said to be unbiased.
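Unbiasedness can be checked by simulation. A minimal sketch (not from the thread; the normal population, parameter value, and sample size are made up for illustration): averaging the sample mean over many simulated samples should land near the true parameter.

```python
import random
import statistics

# Hypothetical example: the sample mean as an unbiased estimator of the
# population mean mu of a Normal(mu, 1) population.
random.seed(1)
mu = 3.0      # the "unknown" parameter, fixed here so we can check the result
n = 50        # sample size
reps = 5000   # number of simulated samples

# Compute the estimator (the sample mean) on each simulated sample.
estimates = [statistics.mean([random.gauss(mu, 1.0) for _ in range(n)])
             for _ in range(reps)]

# The average of the estimates approximates E[estimator], which should be mu.
print(statistics.mean(estimates))
```

The printed average hovers near 3.0, consistent with the sample mean being unbiased for the population mean.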

Next we look at its variance, and in particular at its behaviour as the sample size grows.

The asymptotic variance is the variance of the limiting distribution of $\displaystyle \sqrt{n}(\hat\theta_n - \theta)$, which in regular cases equals $\displaystyle \lim_{n\to\infty} n\,\operatorname{Var}(\hat\theta_n)$.

What you seem to want is Fisher's information. For a maximum likelihood estimator, the asymptotic variance is the reciprocal of the Fisher information

$\displaystyle I(\theta) = -E\left[\frac{\partial^2}{\partial\theta^2}\log f(X;\theta)\right]$

which is where the expectation of the second derivative of the log-likelihood comes in.

http://ocw.mit.edu/courses/mathemati...s/lecture3.pdf
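The Fisher-information connection can also be seen numerically. A sketch (my own illustration, not from the linked notes): for an Exponential(rate) sample, the MLE of the rate is $1/\bar{x}$, and since $I(\lambda) = 1/\lambda^2$, the asymptotic variance of the MLE is $\lambda^2/n$, so $n \cdot \operatorname{Var}(\hat\lambda)$ should approach $\lambda^2$.

```python
import random
import statistics

# Hypothetical example: MLE of the rate of an Exponential distribution.
# Fisher information: I(rate) = -E[d^2/drate^2 log f(X; rate)] = 1/rate^2,
# so the asymptotic variance of the MLE is 1/(n * I(rate)) = rate^2 / n.
random.seed(0)
rate = 2.0
n = 500       # sample size per replication
reps = 2000   # number of replications

mles = []
for _ in range(reps):
    sample = [random.expovariate(rate) for _ in range(n)]
    xbar = sum(sample) / n
    mles.append(1.0 / xbar)   # MLE of the rate is 1 / sample mean

sim_var = statistics.variance(mles)   # simulated variance of the MLE
asym_var = rate**2 / n                # 1 / (n * I(rate))

# n * sim_var should be close to rate^2 = 4.0
print(n * sim_var, rate**2)
```

The two printed numbers agree to within simulation error, matching the formula built from the expected second derivative of the log-likelihood.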