
- Aug 15th 2010, 09:24 AM, user: Mean quadratic error
To prove that the mean quadratic error of a parameter is the sum of the variance of the parameter and the square of the bias B of the parameter. Thanks

- Aug 15th 2010, 02:34 PM, SpringFan25
Did you mean estimator instead of parameter? Outside of Bayesian statistics, parameters do not normally have a variance.

Estimate: X

True Value: Y

$\displaystyle E(X-Y)^2 = E(X^2) - 2E(XY) + E(Y^2)$

$\displaystyle E(X-Y)^2 = E(X^2) - 2YE(X) + Y^2$

$\displaystyle E(X-Y)^2 = E(X^2) - 2YE(X) + Y^2 + E(X)^2 - E(X)^2$

$\displaystyle E(X-Y)^2 = E(X^2) - E(X)^2 - 2YE(X) + Y^2 + E(X)^2$

$\displaystyle E(X-Y)^2 = Var(X) + Y^2 - 2YE(X) + E(X)^2$

You may find it interesting to note that the remaining quadratic in Y (the squared-bias term "B") factorises to give

$\displaystyle E(X-Y)^2 = Var(X) + E(Y - E(X))^2 $

So for an unbiased estimator (one with $E(X) = Y$), the MSE is equal to the variance of the estimate, as you'd expect from the definition of a variance.
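The decomposition above can be checked numerically. The sketch below (a made-up example, not part of the thread: a sample mean with a deliberately added constant bias) computes the empirical MSE and the empirical variance plus squared bias; the identity holds exactly for the empirical distribution, up to floating-point error.

```python
import random

random.seed(0)

# Hypothetical setup: true value Y = 5, estimator X = sample mean of
# n_draws Gaussian observations plus a deliberate bias of 0.5.
Y = 5.0
bias = 0.5
n_draws = 10
n_trials = 20_000

estimates = []
for _ in range(n_trials):
    sample_mean = sum(random.gauss(Y, 2.0) for _ in range(n_draws)) / n_draws
    estimates.append(sample_mean + bias)

# Empirical moments of the estimator X.
mean_x = sum(estimates) / n_trials
var_x = sum((x - mean_x) ** 2 for x in estimates) / n_trials

# Empirical mean squared error E(X - Y)^2.
mse = sum((x - Y) ** 2 for x in estimates) / n_trials

# MSE = Var(X) + (E(X) - Y)^2, exactly for the empirical distribution.
print(mse, var_x + (mean_x - Y) ** 2)
```

With more trials, `mean_x - Y` also converges to the injected bias of 0.5, so the squared-bias term approaches 0.25.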