calculating variance for a bunch of random variables -- (n-1) or (n)?

We're learning about distributions (binomial etc.), and we were assigned a program where we have to generate m random binomial variates and calculate their standard deviation and expectation. Then we compare those with the standard deviation and expectation of the probability mass function itself.

In my book, we are ALWAYS calculating variance etc. for PMFs. For this reason, I'm confused about the two different ways I'm finding to calculate variance for a bunch of actual data. In fact, I wouldn't even have been aware of the other way except that I checked my code against MATLAB's built-in var(x) function (which uses the other way, the one where you divide by (n-1) instead of n).
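To make the two ways concrete, here's a small sketch in Python (my actual code is in MATLAB; the data here is just made up for illustration). The first formula divides by n, like the textbook PMF definitions; the second divides by n-1, which is what MATLAB's var(x) does by default:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # made-up sample data
n = len(data)
mean = sum(data) / n
ss = sum((x - mean) ** 2 for x in data)  # sum of squared deviations

# "Population" variance: divide by n (matches the PMF-style definition)
var_n = ss / n

# "Sample" variance: divide by n - 1 (what MATLAB's var(x) computes)
var_n1 = ss / (n - 1)

print(var_n)   # same as statistics.pvariance(data)
print(var_n1)  # same as statistics.variance(data)
```

Python's standard library even has both as separate functions (pvariance vs. variance), so the split apparently isn't just a MATLAB quirk.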

So which way am I supposed to calculate variance for this project? And what's the difference between the two ways of calculating it?