Hi,
I intuitively understand Equating Coefficients of Polynomials but I was hoping to find a proof of the method. Does anyone know where I might find one?
Cheers,
Scott.
I think I have figured it out.
Given, say, three arbitrary functions of x, [itex]f_1(x)[/itex], [itex]f_2(x)[/itex], and [itex]f_3(x)[/itex], the constants [itex]a_1, a_2, a_3, b_1, b_2, b_3[/itex], and the following equation:
[tex]a_1 f_1(x) + a_2 f_2(x) + a_3 f_3(x) = b_1 f_1(x) + b_2 f_2(x) + b_3 f_3(x)[/tex]
This can be written:
[tex](a_1 - b_1) f_1(x) + (a_2 - b_2) f_2(x) + (a_3 - b_3) f_3(x) = 0[/tex]
If the functions are linearly independent then the only solution to this equation is when the coefficients of the functions are all zero. This leads to:
[tex]a_1 = b_1, \quad a_2 = b_2, \quad a_3 = b_3[/tex]
This shows that the coefficients on each side of the original equation must be equal. The idea extends to any number of linearly independent functions.
The powers of x appearing in a polynomial are linearly independent, so when two polynomials are identically equal their coefficients can be equated as well.
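As an illustrative sketch (the example polynomial and symbol names here are my own, using sympy), the method can be checked mechanically: collect the coefficients of matching powers of x on both sides and solve the resulting equations.

```python
# Equating coefficients, sketched with sympy (example polynomial is mine).
from sympy import symbols, Poly, solve

x, a, b, c = symbols('x a b c')

# Suppose a*x**2 + b*x + c is identically equal to 3*x**2 - 5*x + 7.
lhs = Poly(a*x**2 + b*x + c, x)
rhs = Poly(3*x**2 - 5*x + 7, x)

# Equating coefficients of matching powers of x gives one equation per power.
equations = [la - ra for la, ra in zip(lhs.all_coeffs(), rhs.all_coeffs())]
solution = solve(equations, [a, b, c])
print(solution)  # {a: 3, b: -5, c: 7}
```

Because the powers of x are linearly independent, this coefficient-by-coefficient system is the only way the two sides can agree for every x.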
Of course, to complete the proof, one would have to show that the different powers of x, 1, x, [itex]x^2[/itex], etc. are "independent". That means showing that [itex]a_0 + a_1 x + a_2 x^2 + \cdot\cdot\cdot + a_n x^n = 0[/itex] for all x only if [itex]a_0 = a_1 = a_2 = \cdot\cdot\cdot = a_n = 0[/itex].
One way to do that is to take n+1 different values for x to get n+1 equations for the coefficients. If we take x = 0 we get [itex]a_0 = 0[/itex] immediately. If we take x = 1 we get [tex]a_0 + a_1 + a_2 + \cdot\cdot\cdot + a_n = 0[/tex], etc.
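A quick numerical sketch of this evaluation argument (the degree and the sample points are my own choice): for n+1 distinct points the Vandermonde matrix of the system is invertible, so a degree-n polynomial vanishing at all of them must have all-zero coefficients.

```python
# Sketch of the "evaluate at enough points" argument, using numpy.
# A degree-n polynomial that vanishes at n+1 distinct points x_0,...,x_n
# satisfies V a = 0, where V is the Vandermonde matrix of those points;
# det(V) != 0 for distinct points, so the only solution is a = 0.
import numpy as np

n = 3
points = np.array([0.0, 1.0, 2.0, 3.0])          # n+1 distinct values of x
V = np.vander(points, N=n + 1, increasing=True)  # row i: [1, x_i, x_i^2, x_i^3]

det = np.linalg.det(V)
print(det)  # nonzero for distinct points

coeffs = np.linalg.solve(V, np.zeros(n + 1))
print(coeffs)  # [0. 0. 0. 0.]
```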
A simpler, but more sophisticated, way would be to take derivatives: since [itex]a_0 + a_1 x + a_2 x^2 + \cdot\cdot\cdot + a_n x^n = 0[/itex] for all x, it is a constant, so its derivative, [itex]a_1 + 2a_2 x + \cdot\cdot\cdot + n a_n x^{n-1}[/itex], is also 0 for all x. Taking x = 0, we now get [itex]a_1 = 0[/itex]. Of course, since that derivative is 0 for all x, it too is a constant and its derivative is 0; taking x = 0 there gives [itex]2a_2 = 0[/itex], so [itex]a_2 = 0[/itex]. Continuing through the nth derivative gives us 0 for all n+1 coefficients (with [itex]a_0 = 0[/itex] coming from setting x = 0 in the original equation).
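The derivative argument can be sketched in sympy (the example coefficients are mine): the kth derivative of [itex]a_0 + a_1 x + \cdot\cdot\cdot + a_n x^n[/itex] at x = 0 equals [itex]k!\,a_k[/itex], so the derivatives at 0 recover every coefficient, and a polynomial that is identically zero forces them all to vanish.

```python
# Recovering each coefficient a_k from the k-th derivative at x = 0,
# sketched with sympy (the example polynomial is my own).
from sympy import symbols, diff, factorial

x = symbols('x')
coeffs = [7, -2, 5, 4]                    # a_0..a_3 of an example polynomial
p = sum(c * x**k for k, c in enumerate(coeffs))

# k-th derivative at 0 is k! * a_k, so dividing by k! gives a_k back.
recovered = [diff(p, x, k).subs(x, 0) / factorial(k) for k in range(len(coeffs))]
print(recovered)  # [7, -2, 5, 4]
```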
The way I see the linear independence of the polynomial terms is to take any two of them, say [itex]x^m[/itex] and [itex]x^n[/itex] with [itex]m < n[/itex]. If they were linearly dependent then there would be some a and b, not both zero, with:
[tex]a x^m + b x^n = 0[/tex]
for all x. If b = 0 this forces a = 0 as well; otherwise, dividing by [itex]b x^m[/itex] (for x ≠ 0) gives [itex]x^{n-m} = -a/b[/itex], a constant. But [itex]x^{n-m}[/itex] is not a constant, which contradicts the original assumption that the terms are linearly dependent. The only solution is when a and b are both zero.
This argument applies to any two terms of a polynomial of any degree, so the terms are pairwise linearly independent. Strictly speaking, though, pairwise independence of more than two functions does not by itself imply independence of the whole set; for the full set 1, x, [itex]x^2[/itex], ..., [itex]x^n[/itex] one still needs an argument like the evaluation or derivative proofs above.
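A small sympy check of that caveat (the three example functions are my own, not from the thread): x, [itex]x^2[/itex], and [itex]x + x^2[/itex] are pairwise independent, yet a nontrivial combination of all three vanishes identically, so pairwise checks alone cannot establish independence of a whole set of general functions.

```python
# Counterexample sketch (my own example): pairwise linear independence
# does not imply linear independence of the whole set.
from sympy import symbols, simplify

x = symbols('x')
f, g, h = x, x**2, x + x**2   # any two of these are linearly independent

# ...but 1*f + 1*g - 1*h is identically zero, so the set {f, g, h}
# is linearly dependent.
combo = 1*f + 1*g - 1*h
print(simplify(combo))  # 0
```

For the powers of x themselves this issue does not arise, since the evaluation and derivative arguments handle the full set directly.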