The Gauss-Markov theorem says that least squares gives the best approximation because the variance is minimised? But surely the best approximation is the one where the average absolute deviation of the constructed curve from the real curve is minimised, not the one where the variance of the constructed curve from the real curve is minimised. What am I getting wrong?
And why is the best curve not the one where the maximum absolute deviation (or maximum squared deviation, for that matter) is a minimum?
This, in case you have not noticed, is a rhetorical question. What is "best" depends on context: why you want a regression curve and what you are going to do with it. There is no single "best" curve independent of purpose; curve fitting is almost always (though the fact is often well hidden) a decision problem.
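To make the point concrete, here is a small sketch (my own illustration, not from the thread): for one and the same data set, three different notions of "best" pick three different constant fits. Least squares picks the mean, least absolute deviations picks the median, and minimising the maximum deviation picks the midrange.

```python
# Three loss criteria applied to the same data; each has a different
# closed-form minimiser when we fit a single constant c.
data = [1.0, 2.0, 2.0, 10.0]

def sum_sq(c):      # least squares criterion
    return sum((x - c) ** 2 for x in data)

def sum_abs(c):     # least absolute deviations criterion
    return sum(abs(x - c) for x in data)

def max_abs(c):     # minimax (Chebyshev) criterion
    return max(abs(x - c) for x in data)

mean = sum(data) / len(data)                # minimises sum_sq  -> 3.75
median = sorted(data)[len(data) // 2 - 1]   # minimises sum_abs -> 2.0
midrange = (min(data) + max(data)) / 2      # minimises max_abs -> 5.5
```

The single outlier (10.0) drags the mean and the midrange upward while barely moving the median, which is exactly why the choice of criterion is a decision about what you care about, not a mathematical absolute.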
February 11th 2011, 04:16 AM
Originally Posted by ThodorisK
Let me rephrase the question:
Is the best approximation curve (i.e. the constructed curve which is closest to the real but unknown curve) the constructed curve B from which the data have the least variance (the least squares method), rather than the constructed curve A from which the data have the least average absolute deviation?
If THAT is proven, the proof must have something to do with the normal distribution, and the reason we use the given definition of variance must be related to that proof and to the normal distribution.
If THAT is false, then why do we use the least squares method and not the other one? Is it only because of the "possibly multiple solutions" issue (see Least absolute deviations - Wikipedia, the free encyclopedia)? Is that the only reason least squares is chosen, or does the previous "THAT" also apply?
And does the above have nothing to do with the reason we use the given definition of variance?
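The "possibly multiple solutions" issue raised in the quoted Wikipedia link can be seen in a toy example (my own, not from the thread): with an even number of data points, every constant between the two middle values minimises the sum of absolute deviations, whereas the sum of squares always has a unique minimiser.

```python
# Non-uniqueness of least absolute deviations for a constant fit.
data = [1.0, 2.0, 4.0, 5.0]

def sum_abs(c):
    return sum(abs(x - c) for x in data)

def sum_sq(c):
    return sum((x - c) ** 2 for x in data)

# Every candidate in [2, 4] attains the same minimal absolute loss of 6.0,
# so the LAD "solution" is an entire interval, not a single number:
tied = [sum_abs(c) for c in (2.0, 2.5, 3.0, 3.5, 4.0)]

# The squared loss, by contrast, is minimised among these candidates
# only at the mean (3.0):
unique = min((sum_sq(c), c) for c in (2.0, 2.5, 3.0, 3.5, 4.0))
```

So non-uniqueness is a real practical drawback of the absolute-deviation criterion, though, as the replies note, it is not the whole story behind the choice of least squares.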
Since there is no "best" curve without specifying a condition of optimality (how you measure "bestness"), the above question is moot.
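One standard way to pin down such a condition of optimality also supplies the normal-distribution link the quoted poster suspects. The sketch below (my own illustration; the data, grid, and constant-plus-Gaussian-noise model are assumptions) shows that if the errors are Gaussian, maximising the likelihood is literally the same optimisation problem as minimising the sum of squares, since the Gaussian log-likelihood is a constant minus half the sum of squared residuals.

```python
import math

# Assumed model: data = unknown constant c + Gaussian noise, sigma = 1.
data = [0.8, 1.9, 3.1, 4.2]

def neg_log_lik(c, sigma=1.0):
    # Negative Gaussian log-likelihood: a constant plus the scaled
    # sum of squared residuals.
    n = len(data)
    return (n / 2) * math.log(2 * math.pi * sigma ** 2) \
        + sum((x - c) ** 2 for x in data) / (2 * sigma ** 2)

def sum_sq(c):
    return sum((x - c) ** 2 for x in data)

grid = [i / 100 for i in range(601)]   # candidate constants 0.00 .. 6.00
mle = min(grid, key=neg_log_lik)       # maximum-likelihood estimate
lsq = min(grid, key=sum_sq)            # least-squares estimate
# Both criteria select the same constant: the sample mean.
```

Under a Laplace (double-exponential) error model the same argument would instead produce the least-absolute-deviations fit, which is one precise sense in which the "best" criterion depends on the assumed error distribution.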