# Math Help - Prove that b is an unbiased estimator of B

1. ## Prove that b is an unbiased estimator of B (have an answer now; please check it)

Prove that the regression slope estimator b is an unbiased estimator of the true slope parameter B (written $\beta$ below).

I know that for b to be an unbiased estimator of B, we need $E[b]=B$, but I'm not sure how to show it mathematically. I'd also like to see how to prove that a is an unbiased estimator of A.

EDIT:

This is what I have so far. Maybe it'll give some people an idea of how to complete it.

$b=\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})y_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$
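As a sanity check (not part of the proof), this estimator is easy to compute numerically. A minimal Python sketch with made-up data:

```python
import numpy as np

# Made-up sample data, just to exercise the formula
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# b = sum((x_i - xbar) * y_i) / sum((x_i - xbar)^2)
xbar = x.mean()
b = np.sum((x - xbar) * y) / np.sum((x - xbar) ** 2)

print(b)  # 1.96 for this data; matches the slope from np.polyfit(x, y, 1)
```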

$y_{i}=\alpha +\beta x_{i}+u_{i}$

So then,

$b=\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})(\alpha +\beta x_{i}+u_{i})}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$

Distributing the numerator over the sum,

$b=\frac{\displaystyle \alpha \sum_{i=1}^{n}(x_{i}-\bar{x}) +\beta \displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})x_{i}+\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$

We can see that (since deviations from the mean sum to zero)

$\alpha \displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})=0$

$\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})(x_{i})=\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}$
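Both identities are easy to check numerically for any sample (a quick sketch, arbitrary x values):

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 7.0])  # arbitrary sample
d = x - x.mean()                    # deviations from the mean

# Deviations from the mean always sum to zero
print(d.sum())                      # 0.0 (up to floating point)

# sum((x_i - xbar) * x_i) equals sum((x_i - xbar)^2)
print(np.sum(d * x), np.sum(d ** 2))  # both 21.0 here
```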

So, then we have

$b=\frac{ \displaystyle (\beta)\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}+\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})(u_{i})}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$

I seem to get stuck at this point though. Any help?

EDIT 2:

Ok, I've gotten further. It's amazing how much clearer things are when typed out instead of written in chicken scratch.

Rewriting the equation as

$b=\frac{\displaystyle \beta \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}+\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$

We can see that the first fraction cancels:

$\frac{\displaystyle \beta \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}=\beta$

So now we have

$b=\beta +\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$

From here, I think it's time to try to show that $E[b]=\beta$. So

$E[b]=\beta +\displaystyle E\left[\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}\right]$

Then, I know there's an assumption in my textbook that $E[u_{i}]=0$ (and that the $x_{i}$ are fixed, so each $(x_{i}-\bar{x})/\sum_{j=1}^{n}(x_{j}-\bar{x})^{2}$ is just a constant). Does that mean I can use linearity of expectation and just do

$E[b]=\beta +E[\displaystyle \frac{\displaystyle (x_{1}-\bar{x})u_{1}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}]+E[\displaystyle \frac{\displaystyle (x_{2}-\bar{x})u_{2}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}]+...+E[\displaystyle \frac{\displaystyle (x_{n}-\bar{x})u_{n}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}]$

Which, pulling the constant $(x_{i}-\bar{x})/\sum_{j=1}^{n}(x_{j}-\bar{x})^{2}$ out of each expectation and using $E[u_{i}]=0$, simplifies to

$E[b]=\beta + 0+0+...+0=\beta$
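For whatever it's worth, the conclusion can also be checked by simulation: hold the $x_{i}$ fixed, draw fresh errors with $E[u_{i}]=0$ many times, and the average of the resulting b's should land on $\beta$. A sketch with made-up true intercept, slope, and error distribution (not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)

intercept, beta = 1.0, 2.5      # made-up true parameters
x = np.linspace(0.0, 10.0, 20)  # fixed (non-random) regressors
d = x - x.mean()
denom = np.sum(d ** 2)

# Many samples: fresh errors u_i with E[u_i] = 0 each draw, re-estimate b
estimates = []
for _ in range(20000):
    u = rng.normal(0.0, 1.0, size=x.size)
    y = intercept + beta * x + u
    estimates.append(np.sum(d * y) / denom)

print(np.mean(estimates))  # close to beta = 2.5
```

Any single estimate scatters around 2.5, but the average over many draws hovers right on it, which is exactly what unbiasedness says.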

Can anyone confirm if this is correct or not? Thanks