variance of sample mean and sample standard deviation

Hi all!

I would like to know how to derive the expected variance of the sample mean and of the sample standard deviation.

(I am not a statistician so please forgive my sparse use of formulas.)

I have the following scenario:

1. Draw 10 values from a normal distribution (M = 0; SD = 1).

2a. Calculate the mean value of these 10 drawings.

2b. Calculate the standard deviation of these 10 drawings.

3. Repeat steps 1 and 2 many times.
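The procedure above can be sketched as a quick Monte Carlo check (a minimal sketch assuming NumPy; `reps`, `rng`, and the other names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
N, reps = 10, 100_000

# Step 1: draw `reps` samples of N = 10 values from a standard normal
samples = rng.normal(loc=0.0, scale=1.0, size=(reps, N))

# Steps 2a/2b: mean and (ddof=1, i.e. divide-by-(N-1)) SD of each sample
means = samples.mean(axis=1)
sds = samples.std(axis=1, ddof=1)

# Step 3 is the repetition over `reps`; now look at the spread
print("SD of the sample means:", means.std(ddof=1))  # close to 1/sqrt(10) ≈ 0.316
print("SD of the sample SDs: ", sds.std(ddof=1))
```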

Now, I would like to know the expected values for the standard deviation of the means from 2a and for the standard deviation of the standard deviations from 2b.

For the mean, it is simply SDm = SD/sqrt(10) = (1/10)^0.5 ≈ 0.316.

For the standard deviation, my current suggestion is:

SDsd = sqrt(0.18) ≈ 0.424

0.18 -> the variance of the sample variance for N = 10 drawings from a standard normal distribution

(Sample Variance Distribution -- from Wolfram MathWorld)
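If that 0.18 is MathWorld's variance of the biased (divide-by-N) sample variance, Var(s_N^2) = 2(N-1)*sigma^4/N^2, it can be reproduced directly (a sketch assuming NumPy; the simulation part is my own check, not from MathWorld):

```python
import numpy as np

N, sigma = 10, 1.0

# MathWorld's variance of the biased (divide-by-N) sample variance
var_s2 = 2 * (N - 1) * sigma**4 / N**2
print(var_s2)  # 0.18

# Simulation check of the same quantity
rng = np.random.default_rng(1)
samples = rng.normal(size=(200_000, N))
s2 = samples.var(axis=1, ddof=0)  # biased sample variance of each draw
print(s2.var(ddof=1))  # close to 0.18
```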

So, do my considerations make sense?

Also, in more general terms: I wonder if it is correct that the estimated standard deviation is lower for the standard deviations than for the mean values, since SDm > SDsd for all N.

In other words: estimates of the standard deviation (2b) scatter less across repetitions than estimates of the mean (2a)!?
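To check this claim for general N, one could compare the exact normal-theory values: SDm = sigma/sqrt(N) for the mean, and for the sample SD the c4-based result SDsd = sigma*sqrt(1 - c4^2) (the c4 formula from the chi distribution is an assumption I am bringing in, not something derived above):

```python
from math import gamma, sqrt

def sd_of_mean(n, sigma=1.0):
    # standard deviation of the sample mean
    return sigma / sqrt(n)

def sd_of_sd(n, sigma=1.0):
    # exact SD of the (divide-by-(n-1)) sample SD for normal data,
    # via the c4 constant from the chi distribution
    c4 = sqrt(2.0 / (n - 1)) * gamma(n / 2) / gamma((n - 1) / 2)
    return sigma * sqrt(1.0 - c4**2)

for n in (2, 5, 10, 30, 100):
    print(n, round(sd_of_mean(n), 4), round(sd_of_sd(n), 4))
# SDm > SDsd for every n printed, e.g. n = 10: 0.3162 vs 0.2322
```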

thanks in advance for your help!

trantor