So I learned that the standard deviation is defined as follows: if there is a set

$$\{x_1, x_2, \ldots, x_n\}$$
with the average

$$\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i,$$
then the standard deviation is

$$\sigma = \sqrt{\frac{1}{n}\sum_{i=1}^{n} (x_i - \bar{x})^2}.$$
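For example (an arbitrary set, just for illustration), for $\{2, 4, 4, 4, 5, 5, 7, 9\}$ the average is $\bar{x} = 5$, so

$$\sigma = \sqrt{\tfrac{1}{8}\left((2-5)^2 + (4-5)^2 + \cdots + (9-5)^2\right)} = \sqrt{\tfrac{32}{8}} = 2.$$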
But a while ago I was told that the following works too (and when I checked it, it seemed to give the same result):

If given the set

$$\{x_1, x_2, \ldots, x_n\}$$
with the average

$$\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$$
and the average of the square of each $x_i$

$$\overline{x^2} = \frac{1}{n}\sum_{i=1}^{n} x_i^2,$$
then the standard deviation is as follows:

$$\sigma = \sqrt{\overline{x^2} - \bar{x}^2}.$$
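For instance, a quick numerical check (a minimal Python sketch on the same arbitrary values as above) gives the same result either way:

```python
# Minimal sketch: compare the two standard-deviation formulas numerically.
# The values are the same arbitrary example set used above.
xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(xs)

mean = sum(xs) / n                            # average of the values, x-bar
mean_of_squares = sum(x * x for x in xs) / n  # average of the squared values

# First way: square root of the mean squared deviation from the average.
sd_first = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5

# Second way: square root of (mean of squares minus square of the mean).
sd_second = (mean_of_squares - mean ** 2) ** 0.5

print(sd_first, sd_second)  # both print 2.0 for this set
```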
Why does the second way of getting the standard deviation work? In other words, how do I derive the second formula from the first? The main reason I am having trouble figuring this out is that it seems to make the assumption that

$$\bar{x}^2 = \overline{x^2},$$
which, if I recall correctly, is not true, because in general the square of a sum is not the sum of the squares:

$$\left(\sum_{i=1}^{n} x_i\right)^2 \neq \sum_{i=1}^{n} x_i^2.$$
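For a concrete instance of the mismatch: for the set $\{1, 3\}$, the square of the average is $\bar{x}^2 = 2^2 = 4$, while the average of the squares is $\overline{x^2} = (1^2 + 3^2)/2 = 5$.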