Apparently, you are looking for the "least squares" cubic that goes "closest" to those points.

Think of it this way: you are looking for $a$, $b$, $c$, $d$ such that every point $(x_i, y_i)$, $i = 1, \dots, n$, satisfies

$$a x_i^3 + b x_i^2 + c x_i + d = y_i.$$

We can write that as the matrix equation $A\mathbf{x} = \mathbf{b}$:

$$\begin{pmatrix} x_1^3 & x_1^2 & x_1 & 1 \\ x_2^3 & x_2^2 & x_2 & 1 \\ \vdots & \vdots & \vdots & \vdots \\ x_n^3 & x_n^2 & x_n & 1 \end{pmatrix} \begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix} = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{pmatrix}.$$
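As a concrete sketch of this setup (the sample points below are made up purely for illustration), the matrix $A$ and vector $\mathbf{b}$ can be built in Python with NumPy:

```python
import numpy as np

# Hypothetical data points (x_i, y_i) -- any set of points would do.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.0, 2.0, 0.0, 5.0, 3.0])

# Each row of A is (x_i^3, x_i^2, x_i, 1); b is the column of y_i.
A = np.vander(xs, 4)   # columns are x^3, x^2, x, 1 (decreasing powers)
b = ys

print(A.shape)  # (5, 4): five points, four unknowns a, b, c, d
```

With five points and only four unknowns the system is overdetermined, which is exactly why a least-squares solution is needed.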

Now $A$ is a linear transformation from $\mathbb{R}^4$ to $\mathbb{R}^n$ and so maps $\mathbb{R}^4$ into a 4-dimensional **subspace** of $\mathbb{R}^n$. That equation has a solution only if $\mathbf{b}$ happens to lie in that subspace (that is, if the points happen to lie on the graph of a cubic). If not, then the best we can do is to find the $\mathbf{x}$ in $\mathbb{R}^4$ such that $A\mathbf{x}$ is **closest** to $\mathbf{b}$. For such an $\mathbf{x}$, the vector from $\mathbf{b}$ to $A\mathbf{x}$, namely $A\mathbf{x} - \mathbf{b}$, must be perpendicular to that subspace and so perpendicular to every vector in it. Since every vector in that subspace can be written as $Av$ for some $v$ in $\mathbb{R}^4$, we must have $\langle Av, A\mathbf{x} - \mathbf{b}\rangle = 0$, where "$\langle\, ,\, \rangle$" is the inner (dot) product in $\mathbb{R}^n$.
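That orthogonality condition is easy to check numerically. In the sketch below (made-up points again), the residual of the least-squares solution is perpendicular to every column of $A$, which is the same as saying $A^T(A\mathbf{x} - \mathbf{b}) = 0$:

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.0, 2.0, 0.0, 5.0, 3.0])
A = np.vander(xs, 4)

# Least-squares solution x minimizing ||Ax - b||.
x, *_ = np.linalg.lstsq(A, ys, rcond=None)

residual = A @ x - ys
# <Av, Ax - b> = 0 for every v reduces to A^T (Ax - b) = 0.
print(A.T @ residual)  # numerically ~ [0, 0, 0, 0]
```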

Now, some theory. If $A$ is a linear transformation from vector space $U$ to vector space $V$, then its "adjoint" $A^*$ is the linear transformation from $V$ back to $U$ such that, for all $u$ in $U$ and $v$ in $V$, $\langle Au, v\rangle = \langle u, A^*v\rangle$, where the inner product on the left is in $V$ and on the right in $U$. Here, $\langle Av, A\mathbf{x} - \mathbf{b}\rangle = \langle v, A^*(A\mathbf{x} - \mathbf{b})\rangle = 0$. But now the inner product on the right is in $\mathbb{R}^4$, and $v$ can be any vector in $\mathbb{R}^4$. Since the inner product of $A^*(A\mathbf{x} - \mathbf{b})$ with **any** vector in $\mathbb{R}^4$ (in particular with itself) is 0, we must have $A^*(A\mathbf{x} - \mathbf{b}) = A^*A\mathbf{x} - A^*\mathbf{b} = 0$, or $A^*A\mathbf{x} = A^*\mathbf{b}$, and so, if the inverse exists,

$$\mathbf{x} = (A^*A)^{-1}A^*\mathbf{b}.$$

$(A^*A)^{-1}A^*$ (if it exists) is the "generalized inverse" of $A$. It can be shown that if $A$ has an inverse, so does $A^*$, and $(A^*A)^{-1} = A^{-1}(A^*)^{-1}$, so that

$$(A^*A)^{-1}A^* = A^{-1}(A^*)^{-1}A^* = A^{-1}.$$

But even if $A$ does not have an inverse (if it is not square, as in this case), $(A^*A)^{-1}$ may exist.
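NumPy exposes this generalized (Moore-Penrose) inverse as `np.linalg.pinv`, and for a tall matrix $A$ with linearly independent columns it agrees with the explicit formula $(A^*A)^{-1}A^*$. A quick check, again with made-up points:

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
A = np.vander(xs, 4)   # 5x4, full column rank since the xs are distinct

# Generalized inverse two ways: the explicit formula and NumPy's pinv.
G_formula = np.linalg.inv(A.T @ A) @ A.T
G_pinv = np.linalg.pinv(A)

print(np.allclose(G_formula, G_pinv))  # True
```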

Here,

$$A = \begin{pmatrix} x_1^3 & x_1^2 & x_1 & 1 \\ \vdots & \vdots & \vdots & \vdots \\ x_n^3 & x_n^2 & x_n & 1 \end{pmatrix}$$

so

$$A^* = A^T = \begin{pmatrix} x_1^3 & \cdots & x_n^3 \\ x_1^2 & \cdots & x_n^2 \\ x_1 & \cdots & x_n \\ 1 & \cdots & 1 \end{pmatrix}$$

and

$$A^*A = \begin{pmatrix} \sum x_i^6 & \sum x_i^5 & \sum x_i^4 & \sum x_i^3 \\ \sum x_i^5 & \sum x_i^4 & \sum x_i^3 & \sum x_i^2 \\ \sum x_i^4 & \sum x_i^3 & \sum x_i^2 & \sum x_i \\ \sum x_i^3 & \sum x_i^2 & \sum x_i & n \end{pmatrix}.$$

Also

$$A^*\mathbf{b} = \begin{pmatrix} \sum x_i^3 y_i \\ \sum x_i^2 y_i \\ \sum x_i y_i \\ \sum y_i \end{pmatrix}.$$

Now, the tedious part is to find the inverse of that 4 by 4 matrix. That's going to involve some really messy formulas that I don't want to have to calculate! But once you have done that, $a$, $b$, $c$, and $d$ are given by the components of

$$\mathbf{x} = (A^*A)^{-1}A^*\mathbf{b}.$$
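In practice there is no need to compute that inverse by hand: a linear solver applied to the normal equations does the job, and `np.polyfit` performs exactly this least-squares polynomial fit directly. A sketch with made-up points:

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.0, 2.0, 0.0, 5.0, 3.0])
A = np.vander(xs, 4)

# Solve the normal equations A^T A x = A^T b instead of inverting A^T A.
coeffs = np.linalg.solve(A.T @ A, A.T @ ys)   # components are a, b, c, d

# np.polyfit(xs, ys, 3) computes the same least-squares cubic.
print(np.allclose(coeffs, np.polyfit(xs, ys, 3)))  # True
```

Solving the system directly (or using a QR/SVD-based routine like `np.linalg.lstsq`) is also numerically better behaved than forming the explicit inverse.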