
Mean of the residuals (e-bar)

  1. #1 — blueirony (Newbie)

    Mean of the residuals (e-bar)

    So I can prove that the sum of the residuals equals zero, but how do I prove that the mean of e equals zero? (Here e denotes the residuals, not Euler's number.)

    I've tried, but all I end up doing is going around in circles.

  2. #2 — cl85 (Junior Member)
    Let R_i denote the i-th residual about the sample mean:
    R_i = X_i - \frac{1}{n}\sum_{j=1}^n X_j, \quad i = 1, \dots, n.
    Taking expectations,
    E[R_i] = E[X_i] - \frac{1}{n} \sum_{j=1}^n E[X_j]
    = \mu - \frac{1}{n} \sum_{j=1}^n \mu = 0.
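    A quick numerical sanity check of this (a sketch of my own, not from the thread; any data set will do): deviations from the sample mean sum to zero, so their mean is zero as well.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(loc=5.0, scale=2.0, size=10)  # arbitrary sample

    # Residuals about the sample mean: r_i = x_i - x_bar
    r = x - x.mean()

    # Both the sum and the mean of the residuals vanish, up to float rounding.
    print(r.sum())
    print(r.mean())
    ```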

  3. #3 — matheagle (MHF Contributor)
    I proved that the sum of the residuals is zero in another post here.
    BUT it's not true if your model has no intercept term.
    HERE it is:
    http://www.mathhelpforum.com/math-he...-question.html

  4. #4 — blueirony (Newbie)
    I have been told that the definition of the residual is:

    e_i = y_i - \hat y_i

    I understand the proof given by cl85, but I'm not sure how to fit it to this definition of e.

  5. #5 — matheagle (MHF Contributor)

  6. #6 — blueirony (Newbie)
    I did indeed look at it, but that's the proof for the sum of the residuals - something I already understand how to do. I need help with the expected value of the residual.

  7. #7 — matheagle (MHF Contributor)
    I don't know what your model is, or whether you know how to do least squares with matrices.
    But here's a simple proof, for ANY linear model, that the expected value of the residuals is zero.

    The general model is Y=X\beta +\epsilon, where \epsilon is a column vector of errors (usually normal) with ZERO means.

    HENCE E(Y)=X\beta +E(\epsilon)=X\beta.

    NOW the least squares fit is \hat\beta=(X^tX)^{-1}X^tY,

    and it's easy to prove that E(\hat\beta)=\beta.

    The fitted values are \hat Y=X\hat\beta, so E(\hat Y)=XE(\hat\beta)=X\beta.

    SO the expected value of the residuals e=Y-\hat Y is E(Y)-E(\hat Y)=X\beta -X\beta=0.
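    To see the matrix version numerically, here's a small sketch (my own illustration with NumPy; the design, coefficients, and noise are made up) that fits \hat\beta=(X^tX)^{-1}X^tY and checks that the residuals sum to zero when the model has an intercept column:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 50
    x = rng.uniform(0, 10, size=n)

    # Design matrix with an intercept column -- the case where the result holds.
    X = np.column_stack([np.ones(n), x])
    beta = np.array([2.0, 0.5])          # "true" parameters (unknown in practice)
    y = X @ beta + rng.normal(size=n)    # errors with mean zero

    # Least-squares fit: beta_hat = (X^T X)^{-1} X^T y
    # (solve the normal equations rather than forming the inverse explicitly)
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
    e = y - X @ beta_hat                 # residuals

    print(e.sum())   # zero up to floating-point rounding
    ```

    Without the column of ones the residuals generally do not sum to zero, which is the caveat raised earlier in the thread.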

  8. #8 — blueirony (Newbie)
    Ok, now I'm just red in the face. I feel like such an idiot.

    matheagle - You are brilliant. You really are. And.... yeah, I feel like an idiot. I get you now.

    Just know that this girl is very, very thankful to you.

  9. #9 — matheagle (MHF Contributor)
    I'm flattered, but I don't know what I did.
    Don't let Mr Fantasy see your comments; he thinks I'm a prima donna.
    (And MOO may also be jealous too, so keep your comments to a minimum.)
    I did misread your earlier question.
    I thought that you wanted to see that the sum was zero.
    That's the usual question here.
    Without knowing your model I wasn't sure what you wanted me to do.
    I can easily prove this for any model w/o matrices.
    BUT matrices are the way to go.
    I'm typing the exam I'm giving on this topic right now.

  10. #10 — blueirony (Newbie)
    Is it, uhh... at all possible for you to prove it without matrices?
    Just that this question was given to us before we started using matrices, so I can't use matrices.

    And as for the model, our model for the regression line is:
    y=\hat\beta_0+\hat\beta_1 x
    with residual:
    e_i=y_i-\hat y_i

    Sorry I didn't give that before.
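    With this model the mean of the residuals is in fact identically zero, because \hat\beta_0 = \bar y - \hat\beta_1 \bar x forces \bar e = \bar y - \hat\beta_0 - \hat\beta_1 \bar x = 0. A small numerical sketch of that algebra (my own illustration; the data are simulated):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.uniform(0, 10, size=30)
    y = 1.5 + 0.8 * x + rng.normal(size=30)   # made-up true line plus noise

    # Least-squares slope and intercept via the SSxy / SSxx formulas
    ss_xy = np.sum((x - x.mean()) * (y - y.mean()))
    ss_xx = np.sum((x - x.mean()) ** 2)
    b1 = ss_xy / ss_xx
    b0 = y.mean() - b1 * x.mean()

    # Residuals e_i = y_i - y_hat_i; their mean is zero up to rounding
    e = y - (b0 + b1 * x)
    print(e.mean())
    ```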

  11. #11 — matheagle (MHF Contributor)
    Sure, it works either way.
    BUT it doesn't matter what your model is.
    Have you proved that E(\hat\beta_0) =\beta_0 and E(\hat\beta_1) =\beta_1, i.e. that the estimators are unbiased for these parameters?

    Your original model is Y=\beta_0+\beta_1 x+\epsilon

    where the only random variable is the \epsilon and it has mean zero.

    THUS E(Y)=\beta_0+\beta_1 x.

    NOW if you have proved E(\hat\beta_0) =\beta_0 and E(\hat\beta_1) =\beta_1 it follows from \hat Y=\hat\beta_0+\hat\beta_1 x that E(\hat Y)=\beta_0+\beta_1 x.

    I just did the scratch work to prove E(\hat\beta_0) =\beta_0 and E(\hat\beta_1) =\beta_1 via the SS_{xy} and SS_{xx} formulas.
    Last edited by matheagle; May 9th 2009 at 10:03 PM.

  12. #12 — blueirony (Newbie)
    Thank you, thank you, thank you, thank you, thank you, thank you, THANK YOU.

    Mr Fantasy and MOO can be jealous for all I care. This helps so much.


    Sidenote: I hate that whenever I see an answer to a proof, it always looks so easy.

  13. #13 — matheagle (MHF Contributor)
    Do you need this with \hat\beta_0 =\bar Y-\hat\beta_1\bar x and \hat\beta_1 ={SS_{xy}\over SS_{xx}}? Here's the scratch work...

    Start with Y_i=\beta_0 +\beta_1 x_i+\epsilon_i and \bar Y=\beta_0 +\beta_1 \bar x+\bar\epsilon

    E(Y_i)=\beta_0 +\beta_1 x_i+0

    Since \hat\beta_1 ={SS_{xy}\over SS_{xx}}={\sum_{i=1}^n(x_i-\bar x)Y_i\over \sum_{i=1}^n(x_i-\bar x)^2} (the \bar Y term drops out because \sum_{i=1}^n(x_i-\bar x)=0), we get E(\hat\beta_1) ={\sum_{i=1}^n(x_i-\bar x)E(Y_i)\over \sum_{i=1}^n(x_i-\bar x)^2}

     ={\sum_{i=1}^n(x_i-\bar x)(\beta_0 +\beta_1 x_i)\over \sum_{i=1}^n(x_i-\bar x)^2}  =\beta_0 {\sum_{i=1}^n(x_i-\bar x)\over \sum_{i=1}^n(x_i-\bar x)^2}+\beta_1{\sum_{i=1}^n(x_i-\bar x)^2\over \sum_{i=1}^n(x_i-\bar x)^2}=\beta_1

    Next, E(\bar Y)=\beta_0 +\beta_1 \bar x+0

    SO, using the fact that \hat\beta_1 is unbiased for \beta_1

    we have E(\hat\beta_0)=E(\bar Y) -E(\hat\beta_1)\bar x= \beta_0 +\beta_1 \bar x-\beta_1 \bar x= \beta_0 .
    Last edited by matheagle; May 9th 2009 at 10:34 PM.
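    A Monte Carlo check of the unbiasedness claims above (my own sketch; the true parameters, the fixed design, and the error distribution are all made up): averaging the estimates over many simulated data sets should recover \beta_0 and \beta_1.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    b0_true, b1_true = 2.0, -0.7
    x = np.linspace(0, 5, 20)          # fixed design, as in the derivation

    b0_hats, b1_hats = [], []
    for _ in range(20000):
        # New errors each replication, mean zero as the model assumes
        y = b0_true + b1_true * x + rng.normal(size=x.size)
        ss_xy = np.sum((x - x.mean()) * (y - y.mean()))
        ss_xx = np.sum((x - x.mean()) ** 2)
        b1 = ss_xy / ss_xx
        b1_hats.append(b1)
        b0_hats.append(y.mean() - b1 * x.mean())

    # Sample means of the estimators approximate their expectations.
    print(np.mean(b0_hats))   # close to 2.0
    print(np.mean(b1_hats))   # close to -0.7
    ```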

  14. #14 — matheagle (MHF Contributor)
    Corrections here...

    the model is y=\beta_0+\beta_1 x+\epsilon

    and the least squares fit (line through the data) is \hat y=\hat\beta_0+\hat\beta_1 x

    The \hat\beta's are the estimators (random variables) of the unknown parameters (constants) \beta's.

    Quote Originally Posted by blueirony
    Is it, uhh... at all possible for you to prove it without matrices?
    Just that this question was given to us before we started using matrices, so I can't use matrices.

    And as for the model, our model for the regression line is:
    y=\hat\beta_0+\hat\beta_1 x
    with residual:
    e_i=y_i-\hat y_i

    Sorry I didn't give that before.

