I am in an Algebra II course where we talk about fitting functions with polynomials. We use a method called finite differences to find the degree of the polynomial that fits the given data (x- and y-values).
What makes finite differences tick? Why does it work?
What makes it work is that the first differences of a polynomial of degree n are the values of a polynomial of degree n-1: the leading terms cancel, so the degree drops by one. For example, (x+1)^3 - x^3 = 3x^2 + 3x + 1. So if you keep taking differences of differences, you get polynomials of steadily decreasing degree, eventually arriving at a polynomial of degree 0, which is a constant. The number of rounds of differences you need to reach a constant row is the degree of the original polynomial.
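To see this in action, here is a small sketch (not from the original post) that repeatedly takes differences of y = x^3 sampled at equally spaced x-values; the row becomes constant after exactly 3 rounds, matching the degree:

```python
def differences(values):
    """Return the consecutive differences of a list of values."""
    return [b - a for a, b in zip(values, values[1:])]

xs = list(range(7))        # equally spaced x-values: 0, 1, ..., 6
ys = [x**3 for x in xs]    # y = x^3, a degree-3 polynomial

level = ys
rounds = 0
while len(set(level)) > 1:  # stop once every entry in the row is equal
    level = differences(level)
    rounds += 1

print(rounds)  # 3: rounds of differencing = degree of the polynomial
print(level)   # [6, 6, 6, 6]: the constant row is 3! = 6 for x^3
```

Note the constant you land on is n! times the leading coefficient, which is why the row for x^3 settles at 6 rather than 1.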