1. ## Relative efficiency

Suppose that $\displaystyle Y_1, \ Y_2, \ \dotso,\ Y_n$ is a random sample from a normal distribution with mean $\displaystyle \mu$ and variance $\displaystyle \sigma^2$. Given two unbiased estimators, find their relative efficiency.

$\displaystyle \sigma^2_1 = S^2 = \frac{1}{n-1} \sum^n_{i=1} (Y_i-\overline{Y})^2$ and $\displaystyle \sigma^2_2 =\frac{1}{2}(Y_1-Y_2)^2$

By definition, the relative efficiency is $\displaystyle \frac{V[\hat{\sigma}^2_2]}{V[\hat{\sigma}^2_1]}$, so in this case I should have:

$\displaystyle \frac{\sigma^2_2}{\sigma^2_1} = \frac{\frac{1}{2}(Y_1-Y_2)^2}{\frac{1}{n-1} \sum^n_{i=1} (Y_i-\overline{Y})^2}$

I was thinking of expanding it out, but all I got was $\displaystyle \frac{\sigma^2_2}{\sigma^2_1}$.

2. Originally Posted by lllll
$\displaystyle Var(\hat{\sigma}_1^2)$:

It's well known that $\displaystyle \frac{(n-1)S^2}{\sigma^2}$ has a $\displaystyle \chi^2$ distribution with $\displaystyle n - 1$ degrees of freedom, and that the variance of a $\displaystyle \chi^2$ distribution with $\displaystyle \nu$ degrees of freedom is $\displaystyle 2\nu$. Therefore:

$\displaystyle Var\left(\frac{(n-1)S^2}{\sigma^2}\right) = 2(n-1)$

$\displaystyle \Rightarrow \frac{(n-1)^2}{\sigma^4} Var(S^2) = 2(n-1)$

$\displaystyle \Rightarrow Var(S^2) = \frac{2 \sigma^4}{n-1}$.
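As a sanity check (not part of the original derivation), the formula $\displaystyle Var(S^2) = \frac{2\sigma^4}{n-1}$ agrees with a quick simulation; the values of $\displaystyle \mu$, $\displaystyle \sigma$, $\displaystyle n$ and the replication count below are arbitrary illustrative choices:

```python
import numpy as np

# Monte Carlo check of Var(S^2) = 2*sigma^4 / (n - 1) for normal samples.
# mu, sigma, n and the replication count are arbitrary illustrative choices.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 3.0, 2.0, 10, 200_000

samples = rng.normal(mu, sigma, size=(reps, n))
s2 = samples.var(axis=1, ddof=1)       # unbiased sample variance S^2, one per row

empirical = s2.var()                   # simulated Var(S^2)
theoretical = 2 * sigma**4 / (n - 1)   # = 32/9 here
print(empirical, theoretical)
```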

An approach similar to the one used below would also work, but it's more tedious.

----------------------------------------------------------------------------------------------------------------

$\displaystyle Var(\hat{\sigma}_2^2)$:

$\displaystyle Var\left( \frac{1}{2} (Y_1 - Y_2)^2 \right) = \frac{1}{4} Var((Y_1 - Y_2)^2) = \frac{1}{4} \left( E((Y_1 - Y_2)^4) - [E((Y_1 - Y_2)^2)]^2\right)$.

$\displaystyle E((Y_1 - Y_2)^2) = E(Y_1^2 - 2Y_1 Y_2 + Y_2^2) = E(Y_1^2) - 2 E(Y_1) \cdot E(Y_2) + E(Y_2^2)$ $\displaystyle = (\sigma^2 + \mu^2) - 2 \mu^2 + (\sigma^2 + \mu^2) = 2 \sigma^2$.

$\displaystyle E((Y_1 - Y_2)^4) = E(Y_1^4 - 4Y_1^3 Y_2 + 6 Y_1^2 Y_2^2 - 4 Y_1 Y_2^3 + Y_2^4)$ $\displaystyle = E(Y_1^4) - 4E(Y_1^3) E(Y_2) + 6 E(Y_1^2) E(Y_2^2) - 4 E(Y_1) E(Y_2^3) + E(Y_2^4)$

I'll cheat and take the moments $\displaystyle E(Y^3) = \mu^3 + 3\mu \sigma^2$ and $\displaystyle E(Y^4) = \mu^4 + 6\mu^2 \sigma^2 + 3\sigma^4$ from here: http://en.wikipedia.org/wiki/Normal_distribution (the other moments were shown in my reply to your previous unbiased estimator question). Substituting:

$\displaystyle = (\mu^4 + 6\mu^2 \sigma^2 + 3\sigma^4) - 4(\mu^3 + 3\mu \sigma^2) \mu + 6 (\mu^2 + \sigma^2)^2$ $\displaystyle - 4 \mu (\mu^3 + 3\mu \sigma^2) + (\mu^4 + 6\mu^2 \sigma^2 + 3\sigma^4)$ $\displaystyle = (1 - 4 + 6 - 4 + 1)\mu^4 + (6 - 12 + 12 - 12 + 6)\mu^2 \sigma^2 + (3 + 6 + 3)\sigma^4 = 12 \sigma^4$.

Therefore $\displaystyle Var\left( \frac{1}{2} (Y_1 - Y_2)^2 \right) = \frac{1}{4} \left(12 \sigma^4 - 4 \sigma^4 \right) = 2 \sigma^4$.
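A quick simulation (again with arbitrary illustrative constants) agrees with the three results above for $\displaystyle D = Y_1 - Y_2$: $\displaystyle E(D^2) = 2\sigma^2$, $\displaystyle E(D^4) = 12\sigma^4$, and $\displaystyle Var\left(\tfrac{1}{2}D^2\right) = 2\sigma^4$:

```python
import numpy as np

# Monte Carlo check of the moments of D = Y1 - Y2 with Y1, Y2 iid N(mu, sigma^2):
# E[D^2] = 2*sigma^2, E[D^4] = 12*sigma^4, Var(D^2 / 2) = 2*sigma^4.
# mu, sigma and the replication count are arbitrary illustrative choices.
rng = np.random.default_rng(1)
mu, sigma, reps = 3.0, 2.0, 500_000

d = rng.normal(mu, sigma, reps) - rng.normal(mu, sigma, reps)

second = (d**2).mean()          # should be near 2*sigma^2  = 8
fourth = (d**4).mean()          # should be near 12*sigma^4 = 192
var_est = (0.5 * d**2).var()    # should be near 2*sigma^4  = 32
print(second, fourth, var_est)
```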

--------------------------------------------------------------------------------------------------------------

Therefore $\displaystyle \frac{Var(\hat{\sigma}_2^2)}{Var(\hat{\sigma}_1^2)} = \frac{2\sigma^4}{2\sigma^4/(n-1)} = n - 1$.
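Putting the two variances together, the relative efficiency $\displaystyle n - 1$ can also be seen directly by simulating both estimators on the same samples; the constants below are, once more, arbitrary illustrative choices:

```python
import numpy as np

# Empirical relative efficiency Var(sigma2_hat) / Var(sigma1_hat); the
# derivation above says this equals n - 1.  All constants are illustrative.
rng = np.random.default_rng(2)
mu, sigma, n, reps = 0.0, 1.5, 8, 300_000

y = rng.normal(mu, sigma, size=(reps, n))
est1 = y.var(axis=1, ddof=1)           # S^2
est2 = 0.5 * (y[:, 0] - y[:, 1])**2    # (1/2)(Y1 - Y2)^2

ratio = est2.var() / est1.var()
print(ratio, n - 1)                    # ratio should be near n - 1 = 7
```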