Greetings. I am new to this forum so thank you in advance for the help.
From what I understand, there is a "three-sigma" rule for standard deviation which states that about 68.3% of a set of numbers will lie within one standard deviation of the mean, about 95.4% within two standard deviations, and about 99.7% within three standard deviations.
To try the rule out, I went into Excel and made a column of the numbers 1 to 100. The mean is 50.5, and Excel's standard deviation function gives 28.87 for this set. The mean plus one standard deviation is 79.4, and the mean minus one standard deviation is 21.6. So, according to the three-sigma rule, shouldn't about 68 of the numbers lie between 21.6 and 79.4? In fact there are 58.
Furthermore, going two sigmas above the mean gives 108.2, which is above 100, the maximum number in the list.
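In case it helps, here is a short Python sketch of the same calculation I did in Excel (using the standard library's `statistics` module; `pstdev` is the population standard deviation, which matches the 28.87 figure):

```python
import statistics

data = list(range(1, 101))            # the numbers 1 through 100
mean = statistics.mean(data)          # 50.5
sd = statistics.pstdev(data)          # population standard deviation ~= 28.87

low, high = mean - sd, mean + sd      # ~= 21.6 and 79.4
within_one_sd = sum(low <= x <= high for x in data)

print(mean)                  # 50.5
print(round(sd, 2))          # 28.87
print(within_one_sd)         # 58, not the ~68 the rule predicts
print(round(mean + 2 * sd, 1))  # 108.2, above the maximum of 100
```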
I must be applying this incorrectly. Please help, and thanks again!