I have a book that explains standard deviation in two ways. One uses the symbol sigma (σ), and the other a lower-case letter s. With sigma, the sum of the squared distances from the average is divided by n, but with s it is divided by n − 1. How can they both be the standard deviation? The book says something about sigma being for a population and s for a sample. Please explain.
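Here is how I understand the two formulas from the book, written out in Python on a made-up list of numbers (the data values are just an example I invented). Python's standard `statistics` module happens to implement both definitions, which I use as a cross-check:

```python
import math
import statistics

# Made-up example data
data = [2, 4, 4, 4, 5, 5, 7, 9]
n = len(data)
mean = sum(data) / n

# Sum of squared distances from the average
squared_devs = sum((x - mean) ** 2 for x in data)

sigma = math.sqrt(squared_devs / n)        # population formula: divide by n
s = math.sqrt(squared_devs / (n - 1))      # sample formula: divide by n - 1

# The statistics module uses the same two definitions:
# pstdev is the population version, stdev the sample version
assert math.isclose(sigma, statistics.pstdev(data))
assert math.isclose(s, statistics.stdev(data))

print(sigma, s)  # sigma is the smaller of the two
```

For this data the mean is 5 and the sum of squared deviations is 32, so sigma = √(32/8) = 2.0 while s = √(32/7) ≈ 2.14, which shows that s is always a bit larger than sigma for the same data.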