Hi, I posted this in the Urgent Maths Help forum, but I thought it would be better here. Any help would be greatly appreciated, thanks.

2. Originally Posted by skamoni
Hi, I posted this in the Urgent Maths Help forum, but I thought it would be better here. Any help would be greatly appreciated, thanks.
There is a problem with the formula given in Q1: the right-hand side telescopes to $c(X_n - X_1)$, and I can't see the expected value of this ever being equal to $\sigma^2$.
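The reduction above is a telescoping sum: every interior term cancels, leaving only the endpoints. A quick numerical sketch (not from the original thread; the sample values are arbitrary) illustrates this:

```python
# Telescoping check: sum_{i=1}^{n-1} (x[i+1] - x[i]) = x[n] - x[1],
# since each interior x[i] appears once with + and once with - sign.
xs = [3.0, 1.5, 4.0, 2.5, 7.0]  # arbitrary sample values
telescoped = sum(xs[i + 1] - xs[i] for i in range(len(xs) - 1))
print(telescoped, xs[-1] - xs[0])  # both print 4.0
```

Since $E(X_n) = E(X_1)$ for identically distributed samples, the expectation of $c(X_n - X_1)$ is $0$.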

3. Originally Posted by skamoni
Hi, I posted this in the Urgent Maths Help forum, but I thought it would be better here. Any help would be greatly appreciated, thanks.
An estimator $\Phi$ of a parameter $\theta$ is unbiased iff:

$E(\Phi)=\theta$

Your first problem is to determine the value of $c$ so that:

$\overline{\sigma^2}=c\sum_{i=1}^{n-1}(X_{i+1}-X_i)$

is an unbiased estimator for the variance.

This is impossible unless the variance is $0$, since the expectation of the right-hand side is $0$.

Now if instead it was meant to be:

$\overline{\sigma^2}=c\sum_{i=1}^{n-1}(X_{i+1}-X_i)^2$

we have:

$E(\overline{\sigma^2})=c\sum_{i=1}^{n-1}E[(X_{i+1}-X_i)^2]=c (n-1)E(X_{2}^2-2X_{2}X_1+X_1^2)$

This last simplification holds because the $X_i$ are independent and identically distributed, so each term $E[(X_{i+1}-X_i)^2]$ equals $E[(X_2-X_1)^2]$.

So:

$E(\overline{\sigma^2})=c(n-1)\left(E(X^2)-2[E(X)]^2+E(X^2)\right)=2c(n-1)\sigma^2$
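A simulation can confirm the identity $E\left[\sum_{i=1}^{n-1}(X_{i+1}-X_i)^2\right] = 2(n-1)\sigma^2$ without giving the answer away. This sketch is not from the original post; the choice of normal samples, $n$, and $\sigma$ is arbitrary:

```python
import random

# Monte Carlo check: for i.i.d. samples, E[(X_{i+1} - X_i)^2] = 2*sigma^2,
# so the sum of squared successive differences has expectation 2(n-1)*sigma^2.
random.seed(0)
n = 10          # sample size (arbitrary)
sigma = 2.0     # true standard deviation (arbitrary)
trials = 200_000

total = 0.0
for _ in range(trials):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    total += sum((xs[i + 1] - xs[i]) ** 2 for i in range(n - 1))

estimate = total / trials
expected = 2 * (n - 1) * sigma ** 2   # = 72 for these parameters
print(estimate, expected)  # estimate should be close to 72
```

Dividing the averaged sum by $2(n-1)$ should then recover $\sigma^2$, which points at the required $c$.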

So, can you now tell us what value of $c$ makes $\overline{\sigma^2}$ an unbiased estimator?

RonL

4. ## thanks

Yeah, sorry, I got the question wrong: it should have included the square. Thanks a lot for your help.