# Sample Variance

• August 20th 2009, 08:17 AM
knighty
Sample Variance
From what I know, the sample variance is $\frac{1}{n-1}\sum_{i=1}^n (X_i-\bar X)^2$

I have proven the expectation of the sample variance to be sigma^2.
Is there such thing as the variance of the sample variance?

Unlike finding the expectation, I'm stuck halfway through finding its variance because I can't evaluate Var(Xi^2) and Var(Xbar^2). (I was able to evaluate E(Xi^2) and E(Xbar^2) when I expanded the sample variance to find its expectation.)

Any hints?
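Yes, Var(S^2) exists and can be estimated numerically. A quick Monte Carlo sketch (assuming, just for illustration, i.i.d. standard normal data, where the known closed form is $\mathrm{Var}(S^2)=\frac{2\sigma^4}{n-1}$):

```python
# Monte Carlo check that Var(S^2) matches the known closed form
# for normal data: Var(S^2) = 2*sigma^4 / (n - 1).
# Assumed setup: X_i i.i.d. N(0, 1), n = 10, many replications.
import random
import statistics

random.seed(42)
n, reps, sigma = 10, 20000, 1.0

s2_values = []
for _ in range(reps):
    sample = [random.gauss(0.0, sigma) for _ in range(n)]
    s2_values.append(statistics.variance(sample))  # divides by n - 1

empirical = statistics.variance(s2_values)   # Var(S^2) estimated across replications
theoretical = 2 * sigma**4 / (n - 1)         # = 2/9 for these parameters

print(empirical, theoretical)
```

With 20000 replications the empirical value should land close to 2/9.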
• August 20th 2009, 09:29 AM
Moo
Hello,

Actually, the sample variance is $\frac 1n \sum_{i=1}^n (X_i-\bar X)^2$

If you divide by n-1 instead, you get the unbiased estimator of the variance.

For Var(Xi^2), use the formula for the variance: Var(Z) = E(Z^2) - [E(Z)]^2
So here, we have Var(Xi^2) = E(Xi^4) - [E(Xi^2)]^2

For E(Xi^4), you'll have to calculate this on your own (using the mgf, or the pdf)
For E(Xi^2), this is just Var(Xi)+[E(Xi)]^2
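The identity E(Z^2) = Var(Z) + [E(Z)]^2 is easy to verify exactly on a toy distribution (chosen purely for illustration):

```python
# Exact check of E(X^2) = Var(X) + [E(X)]^2 on a small discrete example:
# X uniform on {1, 2, 3}.
from fractions import Fraction

values = [Fraction(1), Fraction(2), Fraction(3)]
p = Fraction(1, 3)  # each value has probability 1/3

e_x = sum(p * x for x in values)          # E(X)   = 2
e_x2 = sum(p * x * x for x in values)     # E(X^2) = 14/3
var_x = e_x2 - e_x**2                     # Var(X) = 2/3

print(e_x2, var_x + e_x**2)  # both equal 14/3
```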

For Var([X bar]^2), this is more complicated, but it can still be done.

------------------------------------------------
But I suspect you didn't apply the variance formula correctly, because Var(X+Y) = Var(X) + Var(Y) only holds when X and Y are independent...
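The independence requirement matters: with dependent terms the additivity fails. A minimal exact counterexample with Y = X (so Cov(X, Y) = Var(X) ≠ 0):

```python
# Var(X + Y) = Var(X) + Var(Y) requires independence (zero covariance).
# Counterexample: take Y = X, so X + Y = 2X.
from fractions import Fraction

values = [Fraction(0), Fraction(1)]   # X is a fair coin flip
p = Fraction(1, 2)

e_x = sum(p * x for x in values)                    # E(X)   = 1/2
var_x = sum(p * (x - e_x)**2 for x in values)       # Var(X) = 1/4

var_sum = 4 * var_x      # Var(2X) = 4 Var(X) = 1  (the correct value)
naive = var_x + var_x    # Var(X) + Var(Y)   = 1/2 (the naive formula)

print(var_sum, naive)  # 1 vs 1/2 -- additivity fails under dependence
```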
• August 20th 2009, 03:33 PM
matheagle
There are many estimators of the population variance, and any of them can be called a sample variance.
The point is usually to estimate the standard deviation. The most natural, but messiest, one is...

$\hat\sigma={\sum_{i=1}^n |X_i-\overline X|\over n}$

This was suggested by Fisher a long time ago, but it has lousy properties,
mainly because the absolute value makes it nondifferentiable.

AND you can find the variance of any random variable, since

$S^2={\sum_{i=1}^n (X_i-\overline X)^2\over n-1}$

is built from random variables, it is itself a random variable, so you can find its mean, variance, probability distribution...
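For reference, when the $X_i$ are i.i.d. with variance $\sigma^2$ and fourth central moment $\mu_4$, the variance of the unbiased $S^2$ has a standard closed form:

$\mathrm{Var}(S^2)=\frac 1n\left(\mu_4-\frac{n-3}{n-1}\,\sigma^4\right)$

For normal data, $\mu_4=3\sigma^4$, and this reduces to $\mathrm{Var}(S^2)=\frac{2\sigma^4}{n-1}$. This is why the OP needed the fourth moments $E(X_i^4)$: they enter through $\mu_4$.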