Originally Posted by HallsofIvy
Apparently, you are looking for the "least squares" cubic that goes "closest" to those points.
Think of it this way: you are looking for a, b, c, d such that each of the points $\displaystyle (x_1, y_1), (x_2, y_2), \cdot\cdot\cdot, (x_n, y_n)$ satisfies
$\displaystyle y_1= ax_1^3+ bx_1^2+ cx_1+ d$,
$\displaystyle y_2= ax_2^3+ bx_2^2+ cx_2+ d$
$\displaystyle \cdot\cdot\cdot$
$\displaystyle y_n= ax_n^3+ bx_n^2+ cx_n+ d$
We can write that as the matrix equation, Ax= b:
$\displaystyle \begin{bmatrix}x_1^3 & x_1^2 & x_1 & 1 \\ x_2^3 & x_2^2 & x_2 & 1 \\\cdot & \cdot & \cdot & \cdot \\\cdot & \cdot & \cdot & \cdot \\\cdot & \cdot & \cdot & \cdot \\x_n^3 & x_n^2 & x_n & 1\end{bmatrix}$$\displaystyle \begin{bmatrix}a \\ b \\ c \\ d\end{bmatrix}= \begin{bmatrix} y_1 \\ y_2 \\ \cdot\\\cdot\\\cdot\\ y_n\end{bmatrix}$.
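In numpy, that design matrix is exactly what `np.vander` produces. A minimal sketch, using made-up sample data (the x and y values here are purely illustrative):

```python
import numpy as np

# Hypothetical data points (x_i, y_i) -- any real data would do
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 2.9, 9.1, 28.0, 65.2, 126.1])

# Design matrix A with columns x^3, x^2, x, 1.
# np.vander(x, 4) gives exactly this decreasing-power layout.
A = np.vander(x, 4)
print(A.shape)  # (6, 4): one row per data point, one column per coefficient
```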
Now A is a linear transformation from $\displaystyle R^4$ to $\displaystyle R^n$, so it maps $\displaystyle R^4$ onto a 4-dimensional subspace of $\displaystyle R^n$. That equation has an exact solution only if b happens to lie in that subspace, that is, only if the points happen to lie on the graph of a cubic. If not, the best we can do is to find the x in $\displaystyle R^4$ for which Ax is closest to b. For such an x, the vector from b to Ax, namely Ax- b, must be perpendicular to that subspace, and so perpendicular to every vector in it. Since every vector in that subspace can be written as Av for some v in $\displaystyle R^4$, we must have <Av, Ax- b>= 0 for every such v, where "< , >" is the inner (dot) product in $\displaystyle R^n$.
Now, some theory. If A is a linear transformation from inner product space U to inner product space V, then its "adjoint", A*, is the linear transformation from V back to U such that, for all u in U and v in V, <Au, v>= <u, A*v>, where the inner product on the left is taken in V and the one on the right in U. Here, <Av, Ax-b>= <v, A*(Ax-b)>= 0, and that inner product on the right is now in $\displaystyle R^4$. Since v can be any vector in $\displaystyle R^4$, the inner product of A*(Ax- b) with every vector in $\displaystyle R^4$ (in particular with itself) is 0, so we must have A*(Ax- b)= A*Ax- A*b= 0, that is, A*Ax= A*b, and so, if the inverse exists, $\displaystyle x= (A^*A)^{-1}A^*b$.
$\displaystyle (A^*A)^{-1}A^*$ (if it exists) is the "generalized inverse" of A. It can be shown that if A has an inverse, so does A*, and $\displaystyle (A^*A)^{-1}= A^{-1}(A^*)^{-1}$, so that $\displaystyle (A^*A)^{-1}A^*= A^{-1}(A^*)^{-1}A^*= A^{-1}$. But even if A does not have an inverse (for instance when it is not square, as in this case), $\displaystyle (A^*A)^{-1}$ may still exist.
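For real matrices the adjoint A* is just the transpose, so the normal equations above can be solved directly. A quick numpy sketch (sample data is made up for illustration; `np.linalg.lstsq` solves the same least-squares problem and should agree):

```python
import numpy as np

# Hypothetical data points
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 2.9, 9.1, 28.0, 65.2, 126.1])
A = np.vander(x, 4)  # columns x^3, x^2, x, 1

# Normal equations: A^T A coef = A^T y  (A^T plays the role of A* here).
# np.linalg.solve avoids forming (A^T A)^{-1} explicitly.
coef = np.linalg.solve(A.T @ A, A.T @ y)

# lstsq solves the same minimization, typically more stably
coef_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(coef, coef_lstsq))  # True
```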
Here, $\displaystyle A= \begin{bmatrix}x_1^3 & x_1^2 & x_1 & 1 \\ x_2^3 & x_2^2 & x_2 & 1 \\\cdot & \cdot & \cdot & \cdot \\\cdot & \cdot & \cdot & \cdot \\\cdot & \cdot & \cdot & \cdot \\x_n^3 & x_n^2 & x_n & 1\end{bmatrix}$ so $\displaystyle A^*= \begin{bmatrix}x_1^3 & x_2^3 & \cdot\cdot\cdot & x_n^3 \\ x_1^2 & x_2^2 & \cdot\cdot\cdot & x_n^2 \\ x_1 & x_2 & \cdot\cdot\cdot & x_n \\ 1 & 1 & \cdot\cdot\cdot & 1 \end{bmatrix}$
and $\displaystyle A^*A= \begin{bmatrix}\sum x_i^6 & \sum x_i^5 & \sum x_i^4 & \sum x_i^3 \\ \sum x_i^5 & \sum x_i^4 & \sum x_i^3 & \sum x_i^2 \\ \sum x_i^4 & \sum x_i^3 & \sum x_i^2 & \sum x_i \\ \sum x_i^3 & \sum x_i^2 & \sum x_i & n\end{bmatrix}$, each sum running over i= 1 to n.
Also $\displaystyle A^*b= \begin{bmatrix}\sum x_i^3y_i \\ \sum x_i^2y_i \\ \sum x_iy_i \\ \sum y_i\end{bmatrix}$
Now, the tedious part is to find the inverse of that 4 by 4 matrix. That's going to involve some really messy formulas that I don't want to have to calculate! But once you have done that, a, b, c, and d are given by the components of $\displaystyle [(A^*A)^{-1}][A^* b]$.
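In practice you can let the computer do that tedious part. A sketch that builds A*A and A*b from the power sums above and solves for the coefficients (data values are hypothetical; `np.polyfit` fits the same cubic and serves as a cross-check):

```python
import numpy as np

# Hypothetical data points
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 2.9, 9.1, 28.0, 65.2, 126.1])
n = len(x)

# A^T A built entry-by-entry from the power sums
S = lambda p: np.sum(x**p)
AtA = np.array([[S(6), S(5), S(4), S(3)],
                [S(5), S(4), S(3), S(2)],
                [S(4), S(3), S(2), S(1)],
                [S(3), S(2), S(1), n  ]])

# A^T b from the mixed sums
Atb = np.array([np.sum(x**3 * y), np.sum(x**2 * y), np.sum(x * y), np.sum(y)])

# a, b, c, d are the components of (A^T A)^{-1} (A^T b);
# np.linalg.solve does this without forming the messy inverse by hand
a, b, c, d = np.linalg.solve(AtA, Atb)
print(a, b, c, d)
```

The same coefficients come out of `np.polyfit(x, y, 3)`, which returns them in the same highest-power-first order.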