# Least Squares Parabola

• June 9th 2012, 10:44 AM
rfhrth
Least Squares Parabola
Hi all

I have been trying to find the values of the coefficients for the least squares parabola:

y = a + bx + cx^2

I have taken partial derivatives wrt a, b, c and now have three equations, but I don't know what to do next...any suggestions? Have you solved this before?

thank you very much!!

I can help with LaTeX in exchange for help. Thank you very much!
• June 9th 2012, 08:15 PM
Prove It
Re: Least Squares Parabola
Quote:

Originally Posted by rfhrth
Hi all

I have been trying to find the values of the coefficients for the least squares parabola:

y = a + bx + cx^2

I have taken partial derivatives wrt a, b, c and now have three equations, but I don't know what to do next...any suggestions? Have you solved this before?

thank you very much!!

I can help with LaTeX in exchange for help. Thank you very much!

Do you have a set of data?
• June 10th 2012, 01:20 PM
rfhrth
Re: Least Squares Parabola
Hi

Thank you for the response. I don't have data; I am trying to show that the values of a, b, c minimise the sum of squared errors. I took the partial derivatives of the appropriate equation wrt a, b, c and set them to zero, but now I am unsure how to solve for a, b, c explicitly.

Thank you very much
• June 10th 2012, 09:13 PM
Prove It
Re: Least Squares Parabola
Quote:

Originally Posted by rfhrth
Hi

Thank you for the response. I don't have data; I am trying to show that the values of a, b, c minimise the sum of squared errors. I took the partial derivatives of the appropriate equation wrt a, b, c and set them to zero, but now I am unsure how to solve for a, b, c explicitly.

Thank you very much

Show that a, b, c minimise the sum of squared errors? In any case, the easiest way to get the values of a, b, c is with matrix algebra.
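For reference, if you set the three partial derivatives of the sum of squared errors to zero, you should arrive at the so-called normal equations (with \displaystyle \begin{align*} n \end{align*} the number of data points):

\displaystyle \begin{align*} \sum_{i=1}^n y_i &= n\,a + b\sum_{i=1}^n x_i + c\sum_{i=1}^n x_i^2 \\ \sum_{i=1}^n x_i\,y_i &= a\sum_{i=1}^n x_i + b\sum_{i=1}^n x_i^2 + c\sum_{i=1}^n x_i^3 \\ \sum_{i=1}^n x_i^2\,y_i &= a\sum_{i=1}^n x_i^2 + b\sum_{i=1}^n x_i^3 + c\sum_{i=1}^n x_i^4 \end{align*}

The matrix approach below produces exactly this system, just in a more compact form.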

Suppose you had a set of data \displaystyle \begin{align*} (x_1, y_1), (x_2, y_2), (x_3, y_3), \dots, (x_n, y_n) \end{align*}. Substituting them into your least squares model \displaystyle \begin{align*} y = a + b\,x + c\,x^2 \end{align*} gives a system of equations, which you can write in matrix form:

\displaystyle \begin{align*} y_1 &= a + b\,x_1 + c\,x_1^2 \\ y_2 &= a + b\,x_2 + c\,x_2^2 \\ y_3 &= a + b\,x_3 + c\,x_3^2 \\ \vdots \\ y_n &= a + b\,x_n + c\,x_n^2 \\ \\ \left[\begin{matrix} y_1 \\ y_2 \\ y_3 \\ \vdots \\ y_n \end{matrix}\right] &= \left[\begin{matrix} 1 & x_1 & x_1^2 \\ 1 & x_2 & x_2^2 \\ 1 & x_3 & x_3^2 \\ \vdots & \vdots & \vdots \\ 1 & x_n & x_n^2 \end{matrix}\right] \left[\begin{matrix} a \\ b \\ c \end{matrix}\right] \\ \mathbf{Y} &= \mathbf{X}\mathbf{c} \end{align*}
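As a concrete sketch (using NumPy, with made-up data points), the matrices \displaystyle \begin{align*} \mathbf{Y} \end{align*} and \displaystyle \begin{align*} \mathbf{X} \end{align*} above can be built like this:

```python
# Build the design matrix X and response vector Y for a quadratic
# least-squares fit. The data values here are invented for illustration.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 18.8, 33.1])

# Each row of X is [1, x_i, x_i^2], matching the model y = a + b*x + c*x^2
X = np.column_stack([np.ones_like(x), x, x**2])
Y = y.reshape(-1, 1)

print(X.shape)  # (5, 3): five equations, three unknowns a, b, c
```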

If \displaystyle \begin{align*} \mathbf{X} \end{align*} were a square matrix, you would be able to solve for \displaystyle \begin{align*} \mathbf{c} \end{align*} by premultiplying both sides by \displaystyle \begin{align*} \mathbf{X}^{-1} \end{align*}. But because we have more equations than unknowns, \displaystyle \begin{align*} \mathbf{X} \end{align*} is not square. We can, however, obtain a square \displaystyle \begin{align*} 3 \times 3 \end{align*} coefficient matrix \displaystyle \begin{align*} \mathbf{X}^T\mathbf{X} \end{align*} by premultiplying both sides by \displaystyle \begin{align*} \mathbf{X}^T \end{align*} first.

\displaystyle \begin{align*} \mathbf{Y} &= \mathbf{X}\mathbf{c} \\ \mathbf{X}^T\mathbf{Y} &= \mathbf{X}^T\mathbf{X}\mathbf{c} \\ \left(\mathbf{X}^T\mathbf{X}\right)^{-1}\mathbf{X}^T\mathbf{Y} &= \left(\mathbf{X}^T\mathbf{X}\right)^{-1}\mathbf{X}^T\mathbf{X}\mathbf{c} \\ \left(\mathbf{X}^T\mathbf{X}\right)^{-1}\mathbf{X}^T\mathbf{Y} &= \mathbf{I}\mathbf{c} \\ \left(\mathbf{X}^T\mathbf{X}\right)^{-1}\mathbf{X}^T\mathbf{Y} &= \mathbf{c} \end{align*}
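Numerically, the normal equations \displaystyle \begin{align*} \mathbf{X}^T\mathbf{X}\,\mathbf{c} = \mathbf{X}^T\mathbf{Y} \end{align*} can be solved like this (a sketch in NumPy, with made-up data; solving the linear system directly avoids forming the explicit inverse, which is faster and numerically safer):

```python
# Solve the normal equations (X^T X) c = X^T Y for c = [a, b, c].
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 18.8, 33.1])
X = np.column_stack([np.ones_like(x), x, x**2])  # design matrix

# np.linalg.solve handles (X^T X)^{-1} (X^T y) without an explicit inverse
coeffs = np.linalg.solve(X.T @ X, X.T @ y)
a, b, c = coeffs
print(a, b, c)
```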

It can be shown that the values in \displaystyle \begin{align*}\mathbf{c} \end{align*} are exactly the same as if you had minimised the sum of squared deviations.
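A quick numerical check of that claim (again a sketch with made-up data): np.polyfit fits a polynomial by minimising the sum of squared deviations, so its coefficients should agree with the matrix solution above.

```python
# Compare the normal-equations solution with a direct least-squares fit.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 18.8, 33.1])
X = np.column_stack([np.ones_like(x), x, x**2])

c_normal = np.linalg.solve(X.T @ X, X.T @ y)  # (X^T X)^{-1} X^T Y
c_minimise = np.polyfit(x, y, 2)[::-1]        # polyfit returns highest degree first

print(np.allclose(c_normal, c_minimise))  # the two methods agree
```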