If I have a simple model where $Y_i$ equals a constant plus an error term, $Y_i = B + E_i$, then to minimize the SSE of this model I would use the mean of $Y$ for $B$. How can I prove that the mean of $Y$ minimizes the SSE?
Originally Posted by DannyOcean: If I have a simple model where $Y_i$ equals a constant plus an error term, $Y_i = B + E_i$, then to minimize the SSE of this model I would use the mean of $Y$ for $B$. How can I prove that the mean of $Y$ minimizes the SSE?

Let $b$ be our estimate of the value of $B$; then the SSE corresponding to $b$ is:

$\mathrm{SSE}(b) = \sum_{i=1}^{n} (Y_i - b)^2$

Differentiate and set to zero to find the $b$ that minimises $\mathrm{SSE}(b)$:

$\frac{d}{db}\,\mathrm{SSE}(b) = -2\sum_{i=1}^{n} (Y_i - b) = 0 \quad\Rightarrow\quad b = \frac{1}{n}\sum_{i=1}^{n} Y_i = \bar{Y}$

CB
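The calculus argument above can also be checked numerically. The sketch below (the data values are arbitrary, chosen just for illustration) evaluates $\mathrm{SSE}(b)$ at the sample mean and at nearby candidate values of $b$, and confirms the mean gives the smallest SSE:

```python
# Numerical check that the sample mean minimises SSE(b) = sum((Y_i - b)^2).
# The data below are arbitrary illustrative values, not from the thread.
y = [2.0, 3.5, 1.0, 4.0, 2.5]
mean = sum(y) / len(y)

def sse(b):
    """Sum of squared errors when B is estimated by b."""
    return sum((yi - b) ** 2 for yi in y)

# Compare the mean against nearby candidate estimates of B.
candidates = [mean + d for d in (-1.0, -0.1, 0.0, 0.1, 1.0)]
best = min(candidates, key=sse)
print(best == mean)  # → True: the mean has the smallest SSE
```

This agrees with the identity $\mathrm{SSE}(b) = \mathrm{SSE}(\bar{Y}) + n(b - \bar{Y})^2$, which shows any $b \neq \bar{Y}$ strictly increases the SSE.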