Prove that the least-squares slope estimator $b$ is an unbiased estimator of the true slope parameter $\beta$.

I know that for $b$ to be an unbiased estimator of $\beta$, I need $E[b]=\beta$, but I'm not sure how to show it mathematically. I'd also like to see how to prove that the intercept estimator $a$ is an unbiased estimator of $\alpha$.

EDIT:

This is what I have so far. Maybe it'll give someone an idea of how to complete it.

$\displaystyle b=\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})y_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$
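As a quick numerical sanity check of this formula (not part of the proof), it should agree with any standard least-squares fit; the sketch below assumes `numpy` and made-up data:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + rng.normal(size=50)  # arbitrary intercept 2, slope 3

# Slope from the formula above: sum (x_i - xbar) y_i / sum (x_i - xbar)^2
b = np.sum((x - x.mean()) * y) / np.sum((x - x.mean()) ** 2)

# Slope from numpy's least-squares polynomial fit (degree 1)
slope, intercept = np.polyfit(x, y, 1)

assert np.isclose(b, slope)
```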

$\displaystyle y_{i}=\alpha +\beta x_{i}+u_{i}$

So then,

$\displaystyle b=\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})(\alpha +\beta x_{i}+u_{i})}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$

So, after distributing the numerator,

$\displaystyle b=\frac{\displaystyle \alpha\sum_{i=1}^{n}(x_{i}-\bar{x}) +\beta\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})x_{i}+\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$

We can see that

$\displaystyle \alpha \sum_{i=1}^{n}(x_{i}-\bar{x})=0$

$\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})x_{i}=\sum_{i=1}^{n}(x_{i}-\bar{x})^{2}$
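Both identities are easy to verify numerically for an arbitrary sample (again just a sanity check, assuming `numpy`):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100)
d = x - x.mean()  # deviations from the sample mean

# Deviations from the mean sum to zero (up to float rounding)
assert np.isclose(d.sum(), 0.0)

# sum d_i * x_i == sum d_i^2, because sum d_i * xbar = xbar * sum d_i = 0
assert np.isclose(np.sum(d * x), np.sum(d ** 2))
```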

So, then we have

$\displaystyle b=\frac{\displaystyle \beta\sum_{i=1}^{n}(x_{i}-\bar{x})^{2}+\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$

I seem to get stuck at this point though. Any help?

EDIT 2:

Ok, I've gotten further. It's amazing how much clearer things are when typed out instead of written in chicken scratch.

Rewriting the earlier equation as a sum of two fractions,

$\displaystyle \frac{\displaystyle \beta \sum_{i=1}^{n}(x_{i}-\bar{x})x_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}+\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$

We can see that

$\displaystyle \frac{\displaystyle \beta \sum_{i=1}^{n}(x_{i}-\bar{x})x_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}=\frac{\displaystyle \beta \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}=\beta$

So now we have

$\displaystyle b=\beta +\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}$

From here, I think it's time to try to show that $\displaystyle E[b]=\beta$. So

$\displaystyle E[b]=\beta +E\left[\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}\right]$

Then, I know there's an assumption in my textbook that $\displaystyle E[u_{i}]=0$. If the $x_{i}$ are treated as fixed constants, does that mean linearity of expectation lets me just do

$\displaystyle E[b]=\beta +E\left[\frac{(x_{1}-\bar{x})u_{1}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}\right]+E\left[\frac{(x_{2}-\bar{x})u_{2}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}\right]+...+E\left[\frac{(x_{n}-\bar{x})u_{n}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}\right]$

Since each term is then a constant multiple of $E[u_{i}]=0$, this simplifies to

$\displaystyle E[b]=\beta + 0+0+...+0=\beta$
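This conclusion can also be checked with a quick Monte Carlo simulation: hold the $x_{i}$ fixed (the same assumption as above), draw many fresh error vectors with $E[u_{i}]=0$, and check that the slope estimates average out to $\beta$. A sketch, assuming `numpy` and arbitrary made-up parameter values:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, beta, n = 2.0, 3.0, 30    # made-up true parameters
x = rng.normal(size=n)           # regressors held fixed across replications
d = x - x.mean()
Sxx = np.sum(d ** 2)

# Many replications: same x each time, fresh mean-zero errors u
n_reps = 50_000
u = rng.normal(size=(n_reps, n))
y = alpha + beta * x + u

# Slope estimate in each replication, via the formula for b
b = (y @ d) / Sxx

# Averaging over replications, b should be close to beta (unbiasedness)
assert abs(b.mean() - beta) < 0.05
```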

Can anyone confirm whether this is correct? Thanks.