# Thread: Confidence Interval for Variances

1. ## Confidence Interval for Variances

I understand that

$\frac{(n-1)S^2}{\sigma ^2} = \frac{(n-1)\frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2}{\sigma ^2} \sim \chi ^2(n-1)$
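That first fact is easy to check numerically (a simulation sketch added for illustration; the particular $n$, $\sigma$, and mean below are my own assumptions, not from the thread): $(n-1)S^2/\sigma^2$ should behave like $\chi^2(n-1)$, which has mean $n-1$ and variance $2(n-1)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma, reps = 10, 2.0, 200_000        # assumed values for illustration

# simulate many samples of size n from a normal distribution
x = rng.normal(5.0, sigma, size=(reps, n))
s2 = x.var(axis=1, ddof=1)               # sample variance S^2 per sample
stat = (n - 1) * s2 / sigma**2           # should be ~ chi^2(n-1)

print(stat.mean())   # close to n-1 = 9, the chi^2(n-1) mean
print(stat.var())    # close to 2(n-1) = 18, the chi^2(n-1) variance
```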

I do not understand why

$\frac{n\hat{\sigma}^2}{\sigma ^2} = \frac{n\frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2}{\sigma ^2} \sim \chi ^2(n-2)$

2. It's unclear why you have n-2 degrees of freedom; most likely you're estimating TWO parameters.

And what is $\hat\sigma^2$ here?
$\hat\sigma^2$ can be any estimator of $\sigma^2$;
knowing which one you mean will most likely answer the n-2 df question.

3. Originally Posted by matheagle
It's unclear why you have n-2 degrees of freedom; most likely you're estimating TWO parameters.

And what is $\hat\sigma^2$ here?
$\hat\sigma^2$ can be any estimator of $\sigma^2$;
knowing which one you mean will most likely answer the n-2 df question.
Thanks so much, I figured that wouldn't be enough, but I wasn't sure.

I'm working on regressions. So, x's are the known constants, and the y's are the random variables.

$\hat\sigma^2 = \sum_{i = 1}^{n}(y_i - \hat y_i)^2$,

and,

$\hat y_i = \hat\alpha + \hat\beta (x_i - \bar x)$
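For concreteness, that regression setup can be put into a short simulation (a sketch added for illustration; the values of $\alpha$, $\beta$, $\sigma$ and the x's below are assumptions, not from the thread). In this centered parameterization the least-squares estimates are simply $\hat\alpha = \bar y$ and $\hat\beta = \sum(x_i-\bar x)y_i / \sum(x_i-\bar x)^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, alpha, beta, sigma = 30, 2.0, 0.5, 1.0   # assumed values for illustration
x = np.linspace(0.0, 10.0, n)               # the known constants
xc = x - x.mean()                           # centered x's
y = alpha + beta * xc + rng.normal(0.0, sigma, n)

# least-squares estimates in the centered parameterization
alpha_hat = y.mean()
beta_hat = np.sum(xc * y) / np.sum(xc**2)
y_hat = alpha_hat + beta_hat * xc           # fitted values
sse = np.sum((y - y_hat)**2)                # sum of squared residuals
```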

4. I figured that.
You should have said you were estimating two parameters.
That's why you lose 2 degrees of freedom in the chi-square.
$\hat\sigma^2$ is the MSE here AND not the usual sample variance.
And I'm not so sure about the n in the formula.

5. Originally Posted by matheagle
I figured that.
You should have said you were estimating two parameters.
That's why you lose 2 degrees of freedom in the chi-square.
$\hat\sigma^2$ is the MSE here AND not the usual sample variance.
And I'm not so sure about the n in the formula.
Thanks! So basically, what's the rule with chi-squared distributions and degrees of freedom? My book is getting very frustrating because they're starting to use certain concepts a little too liberally, without explaining where they came from or why they're in the form they are; this is a good example.

Thanks again.

6. $Y_{i}$ follows a normal distribution.

so $\Big(\frac{Y_{i}-\bar{Y_{i}}}{\sigma}\Big)^{2}$ follows a chi-square distribution with 1 degree of freedom

and $\Big(\frac{\sum_{i=1}^{n}(Y_{i}-\bar{Y_{i}})}{\sigma}\Big)^{2}$ follows a chi-square distribution with n degrees of freedom (since you are summing n independent chi-square variables)

so you would expect that $\Big(\frac{\sum_{i=1}^{n}(Y_{i}-\hat{Y_{i}})}{\sigma}\Big)^{2} = \frac{n \hat{\sigma^{2}}}{\sigma^{2}}$ to follow a chi-square distribution with n-2 degrees of freedom because you lose two degrees of freedom estimating the two regression coefficients

7. Just to clarify...
when you write something like $\hat\sigma^2$ the idea is that it estimates $\sigma^2$

so really you should have something like $\hat\sigma^2={\sum(Y_i-\hat Y_i)^2\over n}$

or really the MSE here is $\hat\sigma^2={\sum(Y_i-\hat Y_i)^2\over n-2}$

since $MSE = SSE/(n-2)$ is unbiased for $\sigma^2$
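That unbiasedness is easy to check by simulation (a sketch added for illustration; the values of $n$, $\sigma$ and the x's are assumptions, not from the thread): averaging SSE/(n-2) over many replications recovers $\sigma^2$, while SSE/n comes out systematically too small.

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma, reps = 15, 2.0, 100_000           # assumed values for illustration
xc = np.linspace(0.0, 1.0, n)
xc = xc - xc.mean()                         # centered known constants

# simulate reps regressions at once: y = 1 + 3*(x - xbar) + noise
y = 1.0 + 3.0 * xc + rng.normal(0.0, sigma, size=(reps, n))
beta_hat = (y @ xc) / np.sum(xc**2)         # slope estimate per replication
resid = y - y.mean(axis=1, keepdims=True) - np.outer(beta_hat, xc)
sse = np.sum(resid**2, axis=1)              # SSE per replication

print(np.mean(sse / (n - 2)))   # close to sigma^2 = 4 (unbiased)
print(np.mean(sse / n))         # close to 4*(n-2)/n, i.e. biased low
```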

8. I don't agree with this...........

Originally Posted by Random Variable
$Y_{i}$ follows a normal distribution.

so $\Big(\frac{Y_{i}-\bar{Y_{i}}}{\sigma}\Big)^{2}$ follows a chi-square distribution with 1 degree of freedom

and $\Big(\frac{\sum_{i=1}^{n}(Y_{i}-\bar{Y_{i}})}{\sigma}\Big)^{2}$ follows a chi-square distribution with n degrees of freedom (since you are summing n independent chi-square variables)

so you would expect that $\Big(\frac{\sum_{i=1}^{n}(Y_{i}-\hat{Y_{i}})}{\sigma}\Big)^{2} = \frac{n \hat{\sigma^{2}}}{\sigma^{2}}$ to follow a chi-square distribution with n-2 degrees of freedom because you lose two degrees of freedom estimating the two regression coefficients
I agree with ....

$\Big(\frac{Y_{i}-E{Y_{i}}}{\sigma}\Big)^{2}$ follows a chi-square distribution with 1 degree of freedom

and $\frac{\sum_{i=1}^{n}(Y_{i}-\bar{Y})^2}{\sigma^2}$ follows a chi-square distribution with n-1 degrees of freedom

while

${\sum_{i=1}^{n}(Y_{i}-E(Y_i))^2\over \sigma^2}$ follows a chi-square distribution with n degrees of freedom (since you are summing n independent chi-square variables)

9. In response to matheagle, let me change a few things:

$Y_{i}$ follows a normal distribution.

so $\Big(\frac{Y_{i}-\alpha - \beta x_{i}}{\sigma}\Big)^{2}$ follows a chi-square distribution with 1 degree of freedom

and $\Big(\frac{\sum_{i=1}^{n}(Y_{i}-\alpha-\beta x_{i}) }{\sigma}\Big)^{2}$ follows a chi-square distribution with n degrees of freedom (since you are summing n independent chi-square variables)

so you would expect that $\Big(\frac{\sum_{i=1}^{n}(Y_{i}-\hat{\alpha}-\hat{\beta} x_{i})}{\sigma}\Big)^{2} = \frac{n \hat{\sigma^{2}}}{\sigma^{2}}$ to follow a chi-square distribution with n-2 degrees of freedom because you lose two degrees of freedom estimating the two regression coefficients

10. Originally Posted by Random Variable
In response to matheagle, let me change a few things:

$Y_{i}$ follows a normal distribution.

so $\Big(\frac{Y_{i}-\alpha - \beta x_{i}}{\sigma}\Big)^{2}$ follows a chi-square distribution with 1 degree of freedom
Got it.

Originally Posted by Random Variable
and $\Big(\frac{\sum_{i=1}^{n}(Y_{i}-\alpha-\beta x_{i}) }{\sigma}\Big)^{2}$ follows a chi-square distribution with n degrees of freedom (since you are summing n independent chi-square variables)
Sure.

Originally Posted by Random Variable
so you would expect that $\Big(\frac{\sum_{i=1}^{n}(Y_{i}-\hat{\alpha}-\hat{\beta} x_{i})}{\sigma}\Big)^{2} = \frac{n \hat{\sigma^{2}}}{\sigma^{2}}$ to follow a chi-square distribution with n-2 degrees of freedom because you lose two degrees of freedom estimating the two regression coefficients
What? How do you lose degrees of freedom? I think I'm lacking a fundamental understanding of just what a degree of freedom is. I understand how it relates to the Gamma distribution, and that the number of degrees of freedom is actually the mean of the Chi-square distribution. But I wouldn't know something followed a Chi-square distribution if it was staring me in the face, let alone how many degrees of freedom it employs.

11. Originally Posted by Random Variable
In response to matheagle, let me change a few things:

$Y_{i}$ follows a normal distribution.

so $\Big(\frac{Y_{i}-\alpha - \beta x_{i}}{\sigma}\Big)^{2}$ follows a chi-square distribution with 1 degree of freedom

and $\Big(\frac{\sum_{i=1}^{n}(Y_{i}-\alpha-\beta x_{i}) }{\sigma}\Big)^{2}$ follows a chi-square distribution with n degrees of freedom (since you are summing n independent chi-square variables)

so you would expect that $\Big(\frac{\sum_{i=1}^{n}(Y_{i}-\hat{\alpha}-\hat{\beta} x_{i})}{\sigma}\Big)^{2} = \frac{n \hat{\sigma^{2}}}{\sigma^{2}}$ to follow a chi-square distribution with n-2 degrees of freedom because you lose two degrees of freedom estimating the two regression coefficients

Don't you think that the square should be inside the sum, not outside the sum?
Then I agree, because $\alpha$ is a parameter (an expected value) while $\hat\alpha$ is a statistic.

12. Degrees of freedom can be looked at as the number of independent pieces of information that go into estimating a parameter. By first estimating the 2 regression coefficients, you are constraining the residuals to lie within a certain vector space defined by the 2 normal equations. Therefore, you are working with two less independent pieces of information.
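Those two constraints can be seen directly in a quick numerical check (a sketch added for illustration; the data below are simulated under an assumed model, not from the thread): whatever the data, the least-squares residuals always satisfy the two normal equations, so only n-2 of them are free to vary.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20
x = rng.uniform(0.0, 10.0, n)
y = 1.0 + 0.5 * x + rng.normal(0.0, 1.0, n)  # assumed model, for illustration

# ordinary least squares on an intercept and slope
X = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef

# the two normal equations pin down two linear combinations of the residuals
print(resid.sum())          # essentially 0: residuals sum to zero
print((x * resid).sum())    # essentially 0: residuals orthogonal to x
```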

and MSE is really SSE/(n-2), not SSE/n.
That makes it unbiased for $\sigma^2$.

14. Originally Posted by matheagle
Don't you think that the square should be inside the sum, not outside the sum?
Yes. Anything else I did wrong?