# Thread: variance of ybar proof question

1. ## variance of ybar proof question

In this derivation, the variance of ybar given x is shown to be N * sigma^2.

However, when I derive it, ybar = beta_0 + beta_1*Xbar + Ubar, which boils down to the variance of Ubar, because beta_0, beta_1, and Xbar are constants once we have drawn our sample.

Then we have: Var(Ubar) = Var((1/n) * sum of u_i) = (1/n)^2 * n * sigma^2

which is sigma^2/n

and that is what I know Var(ybar) to be. How does this gel with the proof above?

2. ## Re: variance of ybar proof question

Hey kingsolomonsgrave.

I take it that this is a population mean regression model for the data. Even so, the estimators of the parameters should be consistent, which means their variance shrinks as you get more data.

Intuitively, N * sigma^2 looks completely wrong, but I would start by defining the distribution of u_i | x for this particular population model. Asymptotically, the estimator should be normally distributed and consistent (i.e. its variance should scale like 1/N, as opposed to N * sigma^2).

3. ## Re: variance of ybar proof question

I think you are right: the final result should be sigma^2/n. I think the mistake is that the proof finds the variance of the sum, Var(sum of u_i) = n * sigma^2, whereas we want the variance of the mean, Ubar = (1/n) * sum of u_i, so Var(Ubar) = (1/n)^2 * Var(sum of u_i) = n * sigma^2 / n^2 = sigma^2/n.

does that make sense?
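A quick Monte Carlo check supports this. The sketch below assumes a simple linear model y_i = beta_0 + beta_1*x_i + u_i with i.i.d. normal errors and x held fixed across replications (the parameter values are made up for illustration); the simulated Var(ybar | x) should come out close to sigma^2/n, not n * sigma^2:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: y_i = b0 + b1*x_i + u_i, u_i ~ N(0, sigma^2) i.i.d.
b0, b1, sigma, n = 2.0, 0.5, 3.0, 50
x = rng.uniform(0, 10, size=n)   # fixed design: we condition on these x's

reps = 200_000
u = rng.normal(0.0, sigma, size=(reps, n))   # fresh errors each replication
ybar = (b0 + b1 * x + u).mean(axis=1)        # sample mean of y per replication

print(ybar.var())      # simulated Var(ybar | x), close to 0.18
print(sigma**2 / n)    # theoretical sigma^2 / n = 9/50 = 0.18
```

With x fixed, beta_0 + beta_1*Xbar is the same constant in every replication, so all the variation in ybar comes from Ubar, matching the algebra above.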

4. ## Re: variance of ybar proof question

That makes sense: it's what we would expect for any estimator of a mean, regardless of the underlying distribution.