# Thread: n or n-1 for SD?

1. ## n or n-1 for SD?

Hi, I recently saw this description for how to get variance, and thereby Standard Deviation:

"...sum of the squares of the differences between each observation and the mean, and then that quantity is divided by the sample size."

Then of course SD would be the root of that result. The sample size in their example, instead of being expressed as n, was expressed as n-1. I couldn't for the life of me figure out why, and I'm not a naturally mathy person, so I just 'took their word for it' and accepted it.

I later read the following description for finding the SD via the root mean square (RMS):

"...
The standard deviation of a sample can be found by calculating the RMS size of the deviations from the average. To find the SD, first find the average, then find each observation's difference from the average, and finally calculate the RMS of those deviations."

In this situation, however, they divided by the unmodified sample size, so n and not n-1. How do I know when to subtract 1 from the sample size (i.e., n or n-1)? Neither text ever explained why. Thanks,

David

2. Originally Posted by dkco
How do I know when to subtract 1 from the sample size (i.e., n or n-1)? Neither text ever explained why. Thanks,
When you have only a sample from a population, the standard deviation is $\displaystyle s = \sqrt{\frac{\sum (x-\bar{x})^2}{n-1}}$

When you have the entire population, it is $\displaystyle \sigma = \sqrt{\frac{\sum (x-\mu)^2}{n}}$

The reason for the n-1 (Bessel's correction): the deviations are measured from the sample mean $\bar{x}$, which is itself computed from the data, so they come out slightly too small on average. Dividing by n-1 instead of n corrects that bias.
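To see the two formulas side by side, here is a small sketch in Python (the data set is just an illustrative example). It computes both versions by hand and checks them against the standard library's `statistics.stdev` (sample, n-1) and `statistics.pstdev` (population, n):

```python
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # example data set

n = len(data)
mean = sum(data) / n                              # the average, x-bar
ss = sum((x - mean) ** 2 for x in data)           # sum of squared deviations

sample_sd = (ss / (n - 1)) ** 0.5                 # divide by n-1: sample SD
population_sd = (ss / n) ** 0.5                   # divide by n:   population SD

# These agree with the standard library's two functions
assert abs(sample_sd - statistics.stdev(data)) < 1e-12
assert abs(population_sd - statistics.pstdev(data)) < 1e-12

print(sample_sd, population_sd)
```

Note that `statistics` ships two separate functions precisely because of this n vs. n-1 distinction: `stdev` assumes the data is a sample, `pstdev` assumes it is the whole population.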