• August 15th 2010, 09:24 AM
user
How do I prove that the mean squared error of an estimator is the sum of the variance of the estimator and the squared bias of the estimator? Thanks
• August 15th 2010, 02:34 PM
SpringFan25
Did you mean estimator instead of parameter? Outside of Bayesian statistics, parameters do not normally have a variance.

Estimator: X
True value: Y (a constant)

$E(X-Y)^2 = E(X^2) - 2E(XY) + E(Y^2)$

$E(X-Y)^2 = E(X^2) - 2YE(X) + Y^2$

$E(X-Y)^2 = E(X^2) - 2YE(X) + Y^2 + E(X)^2 - E(X)^2$

$E(X-Y)^2 = E(X^2) - E(X)^2 - 2YE(X) + Y^2 + E(X)^2$

$E(X-Y)^2 = Var(X) + Y^2 - 2YE(X) + E(X)^2$

You may find it interesting to note that the quadratic (the bias term, "B") factorises to give

$E(X-Y)^2 = Var(X) + (Y - E(X))^2$

So for an unbiased estimator, where $E(X) = Y$, the MSE is equal to the variance of the estimator (as you'd expect from the definition of a variance).
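The decomposition is easy to check numerically. A minimal sketch (the setup is my own, not from the thread): estimate a known mean $Y$ with a deliberately biased estimator, here $X = 0.8 \times$ (sample mean), and compare the empirical MSE against empirical variance plus squared bias.

```python
import random

random.seed(0)

# Hypothetical setup: true value Y = 5, and a deliberately biased
# estimator X = 0.8 * (sample mean), so the bias term is visible.
Y = 5.0           # true value
n = 20            # sample size per replication
reps = 100_000    # number of simulated estimates

estimates = []
for _ in range(reps):
    sample = [random.gauss(Y, 2.0) for _ in range(n)]
    estimates.append(0.8 * sum(sample) / n)

mean_x = sum(estimates) / reps
# Population-style (divide-by-N) variance of the estimates
var_x = sum((x - mean_x) ** 2 for x in estimates) / reps
# Empirical mean squared error about the true value
mse = sum((x - Y) ** 2 for x in estimates) / reps
bias_sq = (Y - mean_x) ** 2

print(f"MSE            = {mse:.6f}")
print(f"Var + Bias^2   = {var_x + bias_sq:.6f}")
```

With the divide-by-N variance, the identity MSE = Var + Bias² holds exactly for the empirical distribution (the cross term cancels), so the two printed numbers agree up to floating-point rounding; here the bias is $Y - 0.8Y = 1$, so Bias² is close to 1.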