Hi, I posted this in the Urgent Maths Help forum, but I thought it would be better here. Any sort of help would be greatly appreciated, thanks.

2. Originally Posted by skamoni
Hi, I posted this in the Urgent Maths Help forum, but I thought it would be better here. Any sort of help would be greatly appreciated, thanks.
There is a problem with the formula given in Q1. The right-hand side telescopes to $\displaystyle c(X_n - X_1)$, and I can't see the expected value of this ever being equal to $\displaystyle \sigma^2$ ...
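For the record, the telescoping step is just cancellation of successive terms:

$\displaystyle \sum_{i=1}^{n-1}(X_{i+1}-X_i) = (X_2-X_1)+(X_3-X_2)+\cdots+(X_n-X_{n-1}) = X_n - X_1,$

and since the $\displaystyle X_i$ are identically distributed, $\displaystyle E(X_n - X_1) = E(X_n) - E(X_1) = 0$.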

3. Originally Posted by skamoni
Hi, I posted this in the Urgent Maths Help forum, but I thought it would be better here. Any sort of help would be greatly appreciated, thanks.
An estimator $\displaystyle \Phi$ of a parameter $\displaystyle \theta$ is unbiased iff:

$\displaystyle E(\Phi)=\theta$

Your first problem is to determine the value of $\displaystyle c$ so that:

$\displaystyle \overline{\sigma^2}=c\sum_{i=1}^{n-1}(X_{i+1}-X_i)$

is an unbiased estimator for the variance.

This is impossible unless the variance is $\displaystyle 0$, as the expectation of the right-hand side is $\displaystyle 0$.

Now if instead it was meant to be:

$\displaystyle \overline{\sigma^2}=c\sum_{i=1}^{n-1}(X_{i+1}-X_i)^2$

we have:

$\displaystyle E(\overline{\sigma^2})=c\sum_{i=1}^{n-1}E[(X_{i+1}-X_i)^2]=c (n-1)E(X_{2}^2-2X_{2}X_1+X_1^2)$

This last simplification holds because the $\displaystyle X_i$ are independent and identically distributed, so each term $\displaystyle E[(X_{i+1}-X_i)^2]$ has the same value, e.g. $\displaystyle E[(X_2-X_1)^2]$.

So:

$\displaystyle E(\overline{\sigma^2})=c (n-1)(\overline{X^2}-2\bar{X}^2+\overline{X^2})=2c(n-1)\sigma^2$

where $\displaystyle \overline{X^2}$ denotes $\displaystyle E(X_i^2)$ and $\displaystyle \bar{X}$ denotes $\displaystyle E(X_i)$, so $\displaystyle \overline{X^2}-\bar{X}^2=\sigma^2$ (independence kills the cross term: $\displaystyle E(X_2 X_1)=\bar{X}^2$).
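If you want to sanity-check that expectation numerically, here is a quick simulation sketch (the choices of $\displaystyle n$, mean, and $\displaystyle \sigma^2$ are arbitrary; any i.i.d. distribution with that variance would do):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10            # sample size (arbitrary)
sigma2 = 4.0      # true variance of the X_i (arbitrary)
trials = 200_000  # number of simulated samples

# Draw i.i.d. samples X_1..X_n; the mean (here 3.0) is irrelevant,
# since only the differences X_{i+1} - X_i enter the estimator.
X = rng.normal(loc=3.0, scale=np.sqrt(sigma2), size=(trials, n))

# Sum of squared successive differences for each trial.
S = np.sum(np.diff(X, axis=1) ** 2, axis=1)

# Monte Carlo estimate of E[sum (X_{i+1}-X_i)^2] vs. the formula 2(n-1)sigma^2.
print(np.mean(S))            # should be close to 72
print(2 * (n - 1) * sigma2)  # 72.0
```

The simulated mean lands close to $\displaystyle 2(n-1)\sigma^2$, confirming the algebra above.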

So, can you now tell us what value of $\displaystyle c$ will make $\displaystyle \overline{\sigma^2}$ an unbiased estimator?

RonL

4. ## Thanks

Yeah, sorry, I got the question wrong: the differences should indeed be squared. Thanks a lot for your help.