
- Apr 24th 2008, 03:22 PM, skamoni: Bias estimators - please help
Hi, I posted this in the urgent maths help forum, but I thought it would be better here. Any sort of help would be greatly appreciated, thanks.

- Apr 25th 2008, 04:28 AM, mr fantastic
- Apr 25th 2008, 05:18 AM, CaptainBlack
An estimator $\displaystyle \Phi$ of a parameter $\displaystyle \theta$ is unbiased iff:

$\displaystyle E(\Phi)=\theta$
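For example (this illustration is not from the original post), the sample mean is an unbiased estimator of the population mean $\displaystyle \mu$, since by linearity of expectation:

$\displaystyle E(\bar{X})=E\left(\frac{1}{n}\sum_{i=1}^{n}X_i\right)=\frac{1}{n}\sum_{i=1}^{n}E(X_i)=\frac{1}{n}\cdot n\mu=\mu$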

Your first problem is to determine the value of $\displaystyle c$ so that:

$\displaystyle \overline{\sigma^2}=c\sum_{i=1}^{n-1}(X_{i+1}-X_i)$

is an unbiased estimator for the variance.

This is impossible unless the variance is $\displaystyle 0$, because the expectation of the right-hand side is $\displaystyle 0$.
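To see why the expectation vanishes, note that the sum telescopes, so only the first and last terms survive:

$\displaystyle E\left(c\sum_{i=1}^{n-1}(X_{i+1}-X_i)\right)=c\,E(X_n-X_1)=c(\mu-\mu)=0$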

Now if instead it was meant to be:

$\displaystyle \overline{\sigma^2}=c\sum_{i=1}^{n-1}(X_{i+1}-X_i)^2$

we have:

$\displaystyle E(\overline{\sigma^2})=c\sum_{i=1}^{n-1}E[(X_{i+1}-X_i)^2]=c (n-1)E(X_{2}^2-2X_{2}X_1+X_1^2)$

This last simplification is possible because the $\displaystyle X_i$ are independent and identically distributed.

So:

$\displaystyle E(\overline{\sigma^2})=c (n-1)\left(E(X^2)-2[E(X)]^2+E(X^2)\right)=2c(n-1)\sigma^2$

so can you now tell us what value of $\displaystyle c$ will make $\displaystyle \overline{\sigma^2}$ an unbiased estimator?
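The derivation can also be checked numerically. The sketch below (not part of the original thread; the choice of $\displaystyle n=10$, a normal population, and the sample size are arbitrary assumptions) simulates many samples and averages the estimator; with the value of $\displaystyle c$ that satisfies $\displaystyle 2c(n-1)=1$, the average should come out close to the true variance.

```python
# Monte Carlo check that c * sum (X_{i+1} - X_i)^2 is unbiased for sigma^2
# when c solves 2c(n-1) = 1. Population and sample size are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, trials = 10, 200_000
sigma2 = 4.0  # true variance (sigma = 2); mean 5.0 is irrelevant to the estimator
X = rng.normal(loc=5.0, scale=2.0, size=(trials, n))

c = 1.0 / (2 * (n - 1))  # the value that makes 2c(n-1) = 1
# np.diff gives the n-1 successive differences X_{i+1} - X_i in each sample
est = c * np.sum(np.diff(X, axis=1) ** 2, axis=1)  # one estimate per trial

print(est.mean())  # should be close to sigma2 = 4.0
```

Averaging over many trials approximates $\displaystyle E(\overline{\sigma^2})$, so the printed value landing near $\displaystyle \sigma^2$ is consistent with the estimator being unbiased.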

RonL
- Apr 25th 2008, 06:05 AM, skamoni: thanks
Yeah, sorry, I got the question wrong; the terms should have been squared. Thanks a lot for your help.