Math Help - Prove that b is an unbiased estimator of B


    Prove that b is an unbiased estimator of B (have an answer now; need it checked, please)

    Prove that the regression slope estimator b is an unbiased estimator of the true slope parameter B.

    I know that for b to be an unbiased estimator of B we need E[b]=B, but I'm not sure how to show it mathematically. I'd also like to see how to prove that a is an unbiased estimator of A.

    EDIT:

    This is what I have so far. Maybe it'll give some people an idea of how to complete it.

    b=\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})y_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}

    y_{i}=\alpha +\beta x_{i}+u_{i}

    (writing \alpha and \beta for the true intercept and slope, i.e. the A and B from the problem statement, so they don't get mixed up with the estimators a and b)
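    As a quick numerical sanity check of the slope formula above, here is a minimal Python/NumPy sketch (the code, the variable names, and the particular values alpha=2, beta=3 are my own illustration, not part of the problem): it simulates one sample from y_{i}=\alpha+\beta x_{i}+u_{i} and confirms that the deviation-form estimator matches the slope returned by np.polyfit.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 50
    x = rng.uniform(0.0, 10.0, n)      # regressor values, treated as fixed
    u = rng.normal(0.0, 1.0, n)        # errors with E[u_i] = 0
    y = 2.0 + 3.0 * x + u              # y_i = alpha + beta*x_i + u_i, with alpha=2, beta=3 chosen arbitrarily

    # deviation-form slope estimator: b = sum((x_i - xbar) * y_i) / sum((x_i - xbar)^2)
    b = np.sum((x - x.mean()) * y) / np.sum((x - x.mean()) ** 2)

    # ordinary least-squares fit for comparison; for degree 1, np.polyfit returns [slope, intercept]
    slope, intercept = np.polyfit(x, y, 1)
    print(b, slope)                    # the two slope values should agree up to rounding

    On any single sample b will not equal 3 exactly; unbiasedness is a statement about the average of b over repeated samples, which is what the expectation argument below works out.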

    Substituting the model into the formula for b gives

    b=\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})(\alpha +\beta x_{i}+u_{i})}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}

    So, after distributing the numerator term by term,

    b=\frac{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})\alpha +\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})(\beta x_{i})+\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})(u_{i})}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}

    We can see that

    \alpha \displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})=0

    \displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})(x_{i})=\displaystyle  \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}
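    (For completeness, both identities follow directly from the definition of \bar{x}; a short check, using nothing beyond what's already on the page:

    \displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})=\sum_{i=1}^{n}x_{i}-n\bar{x}=n\bar{x}-n\bar{x}=0

    \displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}=\sum_{i=1}^{n}(x_{i}-\bar{x})x_{i}-\bar{x}\sum_{i=1}^{n}(x_{i}-\bar{x})=\sum_{i=1}^{n}(x_{i}-\bar{x})x_{i}

    so nothing extra is being assumed here.)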

    Plugging these identities in, we then have

    b=\frac{ \displaystyle (\beta)\displaystyle  \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}+\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})(u_{i})}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}

    I seem to get stuck at this point though. Any help?

    EDIT 2:

    OK, I've gotten further. It's amazing how much clearer things are when typed out instead of written in chicken scratch.

    Rewriting the expression as a sum of two fractions,

    \frac{\displaystyle  \beta \sum_{i=1}^{n}(x_{i}-\bar{x})x_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}+\frac{\displaystyle  \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}

    We can see that

    \frac{\displaystyle  \beta \sum_{i=1}^{n}(x_{i}-\bar{x})x_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}=\frac{\displaystyle  \beta \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}=\beta

    So now we have

    b=\beta +\frac{\displaystyle  \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}

    From here, I think it's time to try to show that E[b]=\beta. So

    E[b]=\beta +\displaystyle E[\frac{\displaystyle  \sum_{i=1}^{n}(x_{i}-\bar{x})u_{i}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}]

    Then, I know there's an assumption in my textbook that E[u_{i}]=0. Does that mean I can just do

    E[b]=\beta +E[\displaystyle \frac{\displaystyle  (x_{1}-\bar{x})u_{1}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}]+E[\displaystyle \frac{\displaystyle  (x_{2}-\bar{x})u_{2}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}]+...+E[\displaystyle \frac{\displaystyle  (x_{n}-\bar{x})u_{n}}{\displaystyle \sum_{i=1}^{n}(x_{i}-\bar{x})^{2}}]

    Which, provided the x_{i} can be treated as fixed (or we condition on them) so that each coefficient \frac{x_{i}-\bar{x}}{\sum_{j=1}^{n}(x_{j}-\bar{x})^{2}} is just a constant that comes outside the expectation and multiplies E[u_{i}]=0, simplifies to

    E[b]=\beta + 0+0+...+0=\beta
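    If it helps to see this numerically, here is a minimal Monte Carlo sketch in Python/NumPy (again my own illustration; the true values alpha=2, beta=3, the error distribution, and the number of replications are arbitrary choices): it keeps one fixed set of x values, redraws the errors many times, recomputes b each time, and averages, which should land close to beta.

    import numpy as np

    rng = np.random.default_rng(1)
    n, reps = 50, 20000
    alpha, beta = 2.0, 3.0
    x = rng.uniform(0.0, 10.0, n)          # one fixed set of regressor values, reused in every replication
    sxx = np.sum((x - x.mean()) ** 2)

    b_values = np.empty(reps)
    for r in range(reps):
        u = rng.normal(0.0, 1.0, n)        # fresh errors with E[u_i] = 0 each replication
        y = alpha + beta * x + u
        b_values[r] = np.sum((x - x.mean()) * y) / sxx

    print(b_values.mean())                 # should be close to beta = 3, up to simulation noise

    The average will not be exactly 3 because of simulation noise, but it gets closer as reps grows, which is what E[b]=\beta predicts.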

    Can anyone confirm if this is correct or not? Thanks
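    For the other part of the question (whether a is unbiased for A), here is a minimal sketch along the same lines, assuming a is the usual least-squares intercept a=\bar{y}-b\bar{x}, the x_{i} are still treated as fixed, and reusing E[b]=\beta and E[u_{i}]=0 from above:

    E[a]=E[\bar{y}-b\bar{x}]=E[\bar{y}]-\bar{x}E[b]=(\alpha+\beta\bar{x}+E[\bar{u}])-\bar{x}\beta=\alpha

    since E[\bar{u}]=\frac{1}{n}\sum_{i=1}^{n}E[u_{i}]=0.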
