Let $R$ be the random variable denoting the residual error. I can prove that the sum of the residuals equals zero, but how do I prove that the mean of $e$ equals zero? ($e$ here denotes the residuals, not Euler's number.)
I've tried, but all I end up doing is going around in circles.
I proved the sum of the residuals is zero in another post here.
BUT it's not true if you don't have an intercept term in your model.
HERE it is
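To see that intercept caveat concretely, here's a quick numerical sketch (my own illustration, not from the linked post): fit a through-the-origin line to data that really do have an intercept, and the residuals no longer average to zero.

```python
import numpy as np

# Counterexample: without an intercept term, the least squares
# residuals need not sum (or average) to zero. Data are made up.
rng = np.random.default_rng(2)
x = rng.uniform(1, 5, size=30)
y = 4.0 + 2.0 * x + rng.normal(0, 0.5, size=30)  # true line has an intercept

b_no_int = np.sum(x * y) / np.sum(x * x)  # through-the-origin fit y ≈ b*x
resid = y - b_no_int * x

print(abs(resid.mean()) > 1e-6)  # True: residual mean is NOT zero here
```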
I don't even know what your model is, nor whether you understand how to do least squares with matrices.
But here's a simple proof, for ANY model, that the expected value of the residuals is zero.
The general model is $Y = X\beta + \epsilon$, where the $\epsilon$ is a column vector, usually normal, with ZERO means.
NOW the least squares fit is $\hat\beta = (X'X)^{-1}X'Y$
and it's real easy to prove that $E(\hat\beta) = \beta$.
Our least squares fit is $\hat{Y} = X\hat\beta$ and $e = Y - \hat{Y}$.
SO the expected value of our residuals is $E(e) = E(Y) - E(X\hat\beta) = X\beta - X\beta = 0$.
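As a sanity check on that matrix proof, here's a small numerical sketch (variable names are my own): with an intercept column in the design matrix, the fitted residuals average to zero up to rounding.

```python
import numpy as np

# Sanity check: with an intercept column in X, the least squares
# residuals e = Y - X @ beta_hat sum (and hence average) to zero.
rng = np.random.default_rng(0)

n = 100
x = rng.uniform(0, 10, size=n)
X = np.column_stack([np.ones(n), x])         # design matrix with intercept
beta = np.array([2.0, 0.5])                  # true parameters (arbitrary)
Y = X @ beta + rng.normal(0, 1, size=n)      # model Y = X beta + eps

beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)  # least squares fit
e = Y - X @ beta_hat                              # residuals

print(np.isclose(e.mean(), 0.0))  # True: residual mean is (numerically) zero
```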
I'm flattered, but I don't know what I did.
Don't let Mr Fantasy see your comments; he thinks I'm a prima donna.
(And MOO may also be jealous too, so keep your comments to a minimum.)
I did misread your earlier question.
I thought that you wanted to see that the sum was zero.
That's the usual question here.
Without knowing your model I wasn't sure what you wanted me to do.
I can easily prove this for any model without matrices.
BUT matrices are the way to go.
I'm typing the exam I'm giving on this topic right now.
Is it, uhh... at all possible for you to prove it without matrices?
It's just that this question was given to us before we started using matrices, so I can't use them.
And as for the model, our model for the regression line is $Y_i = \beta_0 + \beta_1 x_i + \epsilon_i$.
Sorry I didn't give that before.
Sure, it works either way.
BUT it doesn't matter what your model is.
Have you proved that $E(\hat\beta_0) = \beta_0$ and $E(\hat\beta_1) = \beta_1$, i.e. that the estimators are unbiased for these parameters?
Your original model is $Y_i = \beta_0 + \beta_1 x_i + \epsilon_i$,
where the only random variable is the $\epsilon_i$ and it has mean zero.
NOW if you have proved $E(\hat\beta_0) = \beta_0$ and $E(\hat\beta_1) = \beta_1$, it follows from $e_i = Y_i - \hat\beta_0 - \hat\beta_1 x_i$ that $E(e_i) = E(Y_i) - \beta_0 - \beta_1 x_i = 0$.
I just did the scratch work to prove $E(\hat\beta_1) = \beta_1$ and $E(\hat\beta_0) = \beta_0$ via the $SS_{xy}$ and $SS_{xx}$ formulas.
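A quick numerical sketch of that no-matrix route (all variable names are mine; the thread only names the $SS_{xy}$ and $SS_{xx}$ formulas): compute the slope and intercept directly, then check the residual mean.

```python
import numpy as np

# No-matrix version: simple linear regression via the SSxy / SSxx formulas.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=50)
y = 3.0 + 1.5 * x + rng.normal(0, 1, size=50)   # Y_i = b0 + b1*x_i + eps_i

SSxy = np.sum((x - x.mean()) * (y - y.mean()))
SSxx = np.sum((x - x.mean()) ** 2)

b1 = SSxy / SSxx                  # slope estimate
b0 = y.mean() - b1 * x.mean()     # intercept estimate

resid = y - (b0 + b1 * x)
# e-bar = y-bar - b0 - b1*x-bar = 0 by construction of b0:
print(np.isclose(resid.mean(), 0.0))  # True
```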