# Variance...why square?

• January 3rd 2006, 02:53 AM
jimbot
Variance...why square?
Sorry to try anyone's patience, but now that I've found this place I'm taking advantage!

OK. I don't quite get this. In calculating variance you sum (X - mean(X))^2, no? (And then you divide by n... I know.) I understand that the squaring gets rid of the sign, but why not use the modulus/absolute value of X - mean(X)?

Jimbo (math duffer)
• January 3rd 2006, 02:17 PM
ThePerfectHacker
Well, squaring is needed because it is "reasonable" to measure deviation in that way.
However, there is a simplified variance formula. It is useful if you are doing it by hand:
"Squared-mean minus mean-squared".
Mean-squared is the square of the average, and squared-mean is the mean of the squares. Thus,
$\frac{1}{n}\sum^n_{k=1}(x_k-\mu)^2=\sigma-\mu^{2}$
where $\mu$ is the mean and $\sigma$ is the squared-mean. It is also easy to remember: it is squared-mean minus mean-squared, NOT mean-squared minus squared-mean. Now to find the standard deviation just take the square root of the variance.
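The shortcut "mean of the squares minus square of the mean" can be checked numerically. A minimal sketch in Python (the random sample and tolerance are illustrative, not from the thread):

```python
import random

# Check that the shortcut formula agrees with the definitional
# variance (1/n) * sum((x - mean)^2) on a random sample.
xs = [random.gauss(0, 1) for _ in range(1000)]
n = len(xs)
mean = sum(xs) / n

var_definition = sum((x - mean) ** 2 for x in xs) / n   # (1/n) Σ (x - μ)²
mean_of_squares = sum(x * x for x in xs) / n            # "squared-mean"
var_shortcut = mean_of_squares - mean ** 2              # minus "mean-squared"

assert abs(var_definition - var_shortcut) < 1e-9
```

The two expressions agree up to floating-point rounding, since expanding $(x_k-\mu)^2$ and summing term by term yields exactly the shortcut.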
• January 3rd 2006, 09:10 PM
CaptainBlack
Quote:

Originally Posted by jimbot
Sorry to try anyone's patience, but now that I've found this place I'm taking advantage!

OK. I don't quite get this. In calculating variance you sum (X - mean(X))^2, no? (And then you divide by n... I know.) I understand that the squaring gets rid of the sign, but why not use the modulus/absolute value of X - mean(X)?

Jimbo (math duffer)

There are a lot of reasons for using the variance defined as the mean square
deviation from the mean, but one of the main ones is that it gives something
which can be differentiated. This is useful as it makes it possible to do all
sorts of interesting things with distributions using the techniques of
calculus without always having to worry about exceptional points where the
derivative does not exist.

RonL
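CaptainBlack's differentiability point can be seen numerically: the one-sided difference quotients of x² agree at zero, while those of |x| never do. A small illustrative sketch (the step size h is an arbitrary choice):

```python
# One-sided difference quotients at x = 0.
# For x**2 both sides approach the same limit (0), so the
# derivative exists; for abs(x) the sides give +1 and -1,
# so the derivative at 0 does not exist.
h = 1e-6

sq_right = ((0 + h) ** 2 - 0 ** 2) / h    # ≈ 0 from the right
sq_left = (0 ** 2 - (0 - h) ** 2) / h     # ≈ 0 from the left

abs_right = (abs(0 + h) - abs(0)) / h     # exactly 1.0
abs_left = (abs(0) - abs(0 - h)) / h      # exactly -1.0
```

This kink at zero is exactly the "exceptional point" that calculus-based manipulations of the mean deviation would have to work around.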
• January 3rd 2006, 09:14 PM
CaptainBlack
Quote:

Originally Posted by ThePerfectHacker
Well, squaring is needed because it is "reasonable" to measure deviation in that way.
However, there is a simplified variance formula. It is useful if you are doing it by hand:
"Squared-mean minus mean-squared".
Mean-squared is the square of the average, and squared-mean is the mean of the squares. Thus,
$\frac{1}{n}\sum^n_{k=1}(x_k-\mu)^2=\sigma-\mu^{2}$
where $\mu$ is the mean and $\sigma$ is the squared-mean,

The use of non-standard notation is confusing to readers who
want to know the answer. Please stick to using the standard notation
appropriate to the subject matter.

RonL
• January 4th 2006, 03:11 PM
ThePerfectHacker
Continuing CaptainBlack's explanation: the problem with using the mean deviation is that it relies on the absolute value, which has a point (at zero) where it cannot be differentiated, and the expression changes form on either side of that point.
• January 29th 2006, 06:20 PM
rabeldin
Going further into the usefulness of the mean squared deviation from the mean, it also provides the basis for some important theoretical results such as Chebyshev's theorem and the Central Limit Theorem. These allow us to calculate approximate probabilities given only the mean and variance of a distribution. There are no comparable theoretical developments based on the mean deviation.
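Chebyshev's theorem bounds P(|X − μ| ≥ kσ) by 1/k² for any distribution with finite variance. A minimal empirical check in Python (the exponential sample and k = 2 are illustrative choices, not from the thread):

```python
import random

# Empirically verify Chebyshev's bound P(|X - mu| >= k*sigma) <= 1/k**2
# on an exponential sample (mean 1, standard deviation 1).
random.seed(0)
xs = [random.expovariate(1.0) for _ in range(100_000)]
n = len(xs)

mu = sum(xs) / n
sigma = (sum((x - mu) ** 2 for x in xs) / n) ** 0.5

k = 2.0
frac = sum(1 for x in xs if abs(x - mu) >= k * sigma) / n

assert frac <= 1 / k**2   # observed tail fraction respects the bound
```

For this distribution the true tail probability at k = 2 is about 0.05, comfortably under the Chebyshev bound of 0.25, which shows the bound is universal rather than tight.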