So I learned that standard deviation is as such that if there is a set:

$\displaystyle \{x_1,x_2,x_3, ..., x_n\}$

with the average:

$\displaystyle a = \sum_{i=1}^n \frac{x_i}{n}$

Then the standard deviation is:

$\displaystyle \sqrt{ \frac{(x_1 - a)^2 + (x_2 - a)^2 + ... + (x_n - a)^2}n }$
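As a quick sanity check, the definitional formula is easy to compute directly. A minimal Python sketch (the data set here is made up for illustration):

```python
# Population standard deviation straight from the definition:
# the square root of the mean of the squared deviations from the average.
xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # example data (made up)

n = len(xs)
a = sum(xs) / n  # the average
sd = (sum((x - a) ** 2 for x in xs) / n) ** 0.5
print(sd)  # → 2.0
```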

But a while ago, I was told that this works too (and I checked it, and it seems to work):

If given the set:

$\displaystyle \{x_1,x_2,x_3, ..., x_n\}$

with the average

$\displaystyle a = \sum_{i=1}^n \frac{x_i}{n}$

and the average of the square of each $\displaystyle x_i$:

$\displaystyle s = \sum_{i=1}^n \frac{x_i^2}{n}$

then the standard deviation is as follows:

$\displaystyle \sqrt{s - a^2}$
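The claimed equivalence can be verified numerically. A short Python sketch comparing the two formulas on a made-up data set:

```python
import math

xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # example data (made up)
n = len(xs)

a = sum(xs) / n                  # average of the values
s = sum(x * x for x in xs) / n   # average of the squares

sd_def = math.sqrt(sum((x - a) ** 2 for x in xs) / n)  # first formula
sd_alt = math.sqrt(s - a * a)                          # second formula
print(sd_def, sd_alt)  # → 2.0 2.0
```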

Why is the second way of computing the standard deviation correct? In other words, how do I derive the second way from the first? The main reason I am having trouble figuring this out is that it seems to rely on the assumption that:

$\displaystyle \frac{(x_i - a)^2}{n} = \frac{x_i^2}{n} - a^2$

which, if I recall correctly, is not true, because:

$\displaystyle \frac{(x_i - a)^2}{n} = \frac{x_i^2 - 2ax_i + a^2}{n}$