# Unbiased estimators

• Aug 3rd 2009, 05:50 AM
chella182
Unbiased estimators
So I'm revising my stats by doing a past exam paper, and I'm stuck on one of the questions about unbiased estimators. The question goes...

Suppose that $\displaystyle X_1$, $\displaystyle X_2$,..., $\displaystyle X_n$ are a random sample from a population with mean $\displaystyle \mu$ and variance $\displaystyle \sigma^2$.
a) Prove that the mean estimator $\displaystyle \bar{X}$ is unbiased for $\displaystyle \mu$. State carefully any formulae you use.
b) Prove that $\displaystyle \bar{X}$ has variance $\displaystyle \frac{\sigma^2}{n}$. State carefully any formulae you use.
It's probably really simple like the last unbiased estimator Q I posted a while back, but I'm just stumped.
• Aug 3rd 2009, 06:13 AM
CaptainBlack
Quote:

Originally Posted by chella182
So I'm revising my stats by doing a past exam paper, and I'm stuck on one of the questions about unbiased estimators. The question goes...

Suppose that $\displaystyle X_1$, $\displaystyle X_2$,..., $\displaystyle X_n$ are a random sample from a population with mean $\displaystyle \mu$ and variance $\displaystyle \sigma^2$.
a) Prove that the mean estimator $\displaystyle \bar{X}$ is unbiased for $\displaystyle \mu$. State carefully any formulae you use.
b) Prove that $\displaystyle \bar{X}$ has variance $\displaystyle \frac{\sigma^2}{n}$. State carefully any formulae you use.

It's probably really simple like the last unbiased estimator Q I posted a while back, but I'm just stumped.

$\displaystyle \overline{X}=\frac{1}{n}\sum_{i=1}^n X_i$

By linearity of the expectation operator:

$\displaystyle E(\overline{X})=\frac{1}{n}\sum_{i=1}^n E(X_i)$

but by assumption each $\displaystyle X_i$ has the population mean, $\displaystyle E(X_i)=\mu$, so:

$\displaystyle E(\overline{X})=\frac{1}{n}\sum_{i=1}^n \mu=\mu$

which is precisely the condition for $\displaystyle \overline{X}$ to be an unbiased estimator of $\displaystyle \mu$.
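Part (b) goes the same way, just with the variance operator: since the $\displaystyle X_i$ are independent with common variance $\displaystyle \sigma^2$,

$\displaystyle \text{Var}(\overline{X})=\frac{1}{n^2}\sum_{i=1}^n \text{Var}(X_i)=\frac{1}{n^2}\cdot n\sigma^2=\frac{\sigma^2}{n}$

Both facts can also be checked numerically. The sketch below (not part of the proof; the normal distribution and the particular values of $\displaystyle \mu$, $\displaystyle \sigma$, $\displaystyle n$ are just illustrative choices) simulates many samples of size $n$ and looks at the mean and variance of the resulting sample means:

```python
import random
import statistics

def simulate_xbar(mu, sigma, n, trials, seed=0):
    """Draw `trials` samples of size n from N(mu, sigma^2) and
    return the list of their sample means."""
    rng = random.Random(seed)
    return [
        statistics.fmean(rng.gauss(mu, sigma) for _ in range(n))
        for _ in range(trials)
    ]

mu, sigma, n = 5.0, 2.0, 25
means = simulate_xbar(mu, sigma, n, trials=20000)

# (a) the average of the sample means should be close to mu = 5.0
print(statistics.fmean(means))
# (b) their variance should be close to sigma^2 / n = 4 / 25 = 0.16
print(statistics.variance(means))
```

With 20,000 trials both estimates land very close to the theoretical values, which is a handy way to catch an algebra slip in this kind of derivation.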

CB
• Aug 3rd 2009, 06:16 AM
chella182
Thank you. Does sound rather simple, I just find it difficult to get my head around this sort of stuff.