From what I know, the sample variance is S^2 = (1/(n-1)) * sum from i=1 to n of (Xi - Xbar)^2.
I have proven the expectation of the sample variance to be sigma^2.
Is there such a thing as the variance of the sample variance?
Unlike with the expectation, I'm stuck halfway through finding the variance because I can't evaluate Var(Xi^2) and Var(Xbar^2). (I was able to evaluate E(Xi^2) and E(Xbar^2) when I expanded the sample variance while finding its expectation.)
Actually, the sample variance is (1/(n-1)) * sum from i=1 to n of (Xi - Xbar)^2: if you divide by n-1 (rather than n), it's an unbiased estimator of the variance.
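A quick Monte Carlo sketch of the unbiasedness claim (the distribution, sample size, and trial count below are arbitrary illustrative choices, not anything from the question):

```python
import numpy as np

# Compare the n-1 divisor (unbiased) against the n divisor (biased).
rng = np.random.default_rng(0)
sigma2 = 4.0          # true population variance (std = 2)
n = 5                 # small n makes the n vs. n-1 difference visible
trials = 200_000

samples = rng.normal(loc=10.0, scale=2.0, size=(trials, n))
s2_unbiased = samples.var(axis=1, ddof=1)   # divides by n-1
s2_biased = samples.var(axis=1, ddof=0)     # divides by n

print(s2_unbiased.mean())   # close to sigma2 = 4.0
print(s2_biased.mean())     # close to sigma2 * (n-1)/n = 3.2
```

The averaged n-1 estimator lands on the true variance, while the n divisor systematically undershoots by the factor (n-1)/n.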
For Var(Xi^2), consider the formula for the variance: Var(Z) = E(Z^2) - [E(Z)]^2.
So here, we have Var(Xi^2)=E(Xi^4)-[E(Xi^2)]^2
For E(Xi^4), you'll have to calculate this on your own (using the MGF or the pdf).
For E(Xi^2), this is just Var(Xi) + [E(Xi)]^2.
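A numerical check of these two identities, using a standard normal because its moments are known (E(X^2) = 1, E(X^4) = 3, so Var(X^2) should come out to 3 - 1^2 = 2); the distribution choice is mine, for illustration only:

```python
import numpy as np

# Check Var(X^2) = E(X^4) - [E(X^2)]^2 and E(X^2) = Var(X) + [E(X)]^2
# by simulation, with X ~ N(0, 1).
rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)

e_x2 = (x**2).mean()        # should be near Var(X) + E(X)^2 = 1
e_x4 = (x**4).mean()        # should be near 3, the normal's 4th moment
var_x2 = e_x4 - e_x2**2     # should be near 2

print(e_x2, e_x4, var_x2)
```

The empirical `var_x2` matches the direct simulation estimate `np.var(x**2)`, which is exactly the identity being used.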
For Var(Xbar^2), this is complicated, but it can still be done.
But I feel like you didn't use the formula for the variance correctly... because Var(X+Y) = Var(X) + Var(Y) only holds when X and Y are independent...
There are many estimators of the population variance, and any of them can be called a sample variance.
The point usually is to estimate the standard deviation. The most natural, but messy, estimator is...
This was suggested by Fisher a long time ago, but it has lousy properties,
mainly because the absolute values make it nondifferentiable.
And you can find the variance of any random variable: since the sample variance is composed of random variables, you can find its mean, variance, probability distribution, and so on.
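To make that concrete: for normal data there is even a closed form, since (n-1)S^2/sigma^2 follows a chi-square distribution with n-1 degrees of freedom, giving Var(S^2) = 2*sigma^4/(n-1). A simulation sketch checking that formula (the normality assumption and the particular sigma and n are illustrative choices on my part):

```python
import numpy as np

# Monte Carlo estimate of Var(S^2) for normal samples, compared with
# the chi-square closed form Var(S^2) = 2*sigma^4/(n-1).
rng = np.random.default_rng(2)
sigma = 2.0
n = 10
trials = 200_000

samples = rng.normal(scale=sigma, size=(trials, n))
s2 = samples.var(axis=1, ddof=1)        # sample variance of each row

theory = 2 * sigma**4 / (n - 1)         # 2*16/9, about 3.56
print(s2.var(), theory)
```

For non-normal data the general formula instead involves the fourth central moment, which is why the answer above points you toward E(Xi^4).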