# Equating Coefficients of Polynomials Proof

• Nov 18th 2010, 12:53 PM
sael
Equating Coefficients of Polynomials Proof
Hi,

I intuitively understand Equating Coefficients of Polynomials but I was hoping to find a proof of the method. Does anyone know where I might find one?

Cheers,

Scott.
• Nov 20th 2010, 02:12 PM
sael
I think I have figured it out.

Given three arbitrary functions of x, $f_1(x)$, $f_2(x)$ and $f_3(x)$, and constants $a,b,c,i,j,k$, the equation

$af_1(x)+bf_2(x)+cf_3(x)=if_1(x)+jf_2(x)+kf_3(x)$

can be rewritten as:

$(a-i)f_1(x)+(b-j)f_2(x)+(c-k)f_3(x)=0$

If the functions are linearly independent then the only solution to the equation is when the coefficients of the functions are all zero. This leads to:

$a=i$
$b=j$
$c=k$

This shows that the coefficients on the two sides of the original equation must be equal. The idea extends to any number of linearly independent functions.

The powers of x in a polynomial are linearly independent, so its coefficients can be equated as well.
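To see this concretely, here is a minimal Python sketch (the names and sample points are my own, not from the thread): it samples a polynomial at n+1 distinct points and solves the resulting Vandermonde system with exact rational arithmetic. The coefficients are recovered uniquely, so two polynomials that agree as functions must share coefficients.

```python
from fractions import Fraction

def solve(A, b):
    """Gauss-Jordan elimination with exact Fraction arithmetic."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # find a pivot row with a nonzero entry in this column
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col] != 0:
                factor = M[r][col] / M[col][col]
                M[r] = [x - factor * y for x, y in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# a polynomial with arbitrary coefficients a_1 + a_2 x + a_3 x^2 + a_4 x^3
coeffs = [Fraction(c) for c in (5, -2, 0, 7)]

def p(x):
    return sum(c * x**k for k, c in enumerate(coeffs))

# sample at n+1 = 4 distinct points and rebuild the Vandermonde system
points = [Fraction(x) for x in (0, 1, 2, 3)]
V = [[x**k for k in range(len(coeffs))] for x in points]
recovered = solve(V, [p(x) for x in points])

assert recovered == coeffs  # coefficients are uniquely determined
```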
• Nov 21st 2010, 04:41 AM
HallsofIvy
Of course, to complete the proof, one would have to show that the different powers of x, 1, x, $x^2$, etc. are "independent". That means showing that $a_1+ a_2x+ a_3x^2+ \cdots+ a_{n+1}x^n= 0$ only if $a_1= a_2= a_3= \cdots= a_{n+1}= 0$.

One way to do that is to take n+1 different values for x to get n+1 equations for the coefficients. If we take x= 0 we get $a_1= 0$ immediately. If we take x= 1 we get $a_1+ a_2+ a_3+ \cdots+ a_{n+1}= 0$, etc.
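A sketch of why those n+1 equations pin down the coefficients (my own illustration, not from the post): the system's matrix is a Vandermonde matrix, and its determinant equals the product of the pairwise differences of the sample points, so it is nonzero whenever the points are distinct and the only solution is the zero vector.

```python
from fractions import Fraction

def det(M):
    """Determinant by Laplace expansion along the first row (fine for small n)."""
    if len(M) == 1:
        return M[0][0]
    total = Fraction(0)
    for j in range(len(M)):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

# Vandermonde matrix for n+1 = 4 distinct sample points
points = [Fraction(x) for x in (0, 1, 2, 3)]
V = [[x**k for k in range(len(points))] for x in points]

# the determinant equals the product of pairwise differences x_j - x_i (i < j)
product_formula = Fraction(1)
for i in range(len(points)):
    for j in range(i + 1, len(points)):
        product_formula *= points[j] - points[i]

assert det(V) == product_formula != 0  # distinct points => invertible system
```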

A simpler, but more sophisticated, way would be to take derivatives: since $a_1+ a_2x+ a_3x^2+ \cdots+ a_{n+1}x^n= 0$ for all x, its derivative, $a_2+ 2a_3x+ \cdots+ na_{n+1}x^{n-1}$, is also 0 for all x. Taking x= 0, we now get $a_2= 0$. Of course, since $a_2+ 2a_3x+ \cdots+ na_{n+1}x^{n-1}= 0$ for all x, its derivative is identically 0 as well. Continuing through the nth derivative gives 0 for all n+1 coefficients.
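The derivative argument can be sketched in a few lines of Python (formal differentiation on a coefficient list; the names are mine): the constant term of the k-th derivative is $k!\,a_{k+1}$, so if the polynomial vanished identically, every coefficient would have to be zero.

```python
from math import factorial

def derivative(coeffs):
    """Formal derivative of a_1 + a_2 x + a_3 x^2 + ... given as [a_1, a_2, ...]."""
    return [k * c for k, c in enumerate(coeffs)][1:]

coeffs = [5, -2, 0, 7]  # 5 - 2x + 7x^3

# repeatedly differentiate; the constant term of the k-th derivative is k! * a_{k+1}
recovered = []
p = coeffs[:]
for k in range(len(coeffs)):
    recovered.append(p[0] // factorial(k) if p else 0)
    p = derivative(p)

assert recovered == coeffs  # each coefficient is pinned down in turn
```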
• Nov 21st 2010, 01:03 PM
sael
The way I see the linear independence of the polynomial terms is to take any two of them, say $ax^1$ and $bx^0$. If they were linearly dependent there would be some a and b, not both zero, with:

$ax^1 + bx^0=0$
$ax^1=-bx^0$
$x=-b/a$ (assuming $a \neq 0$)

But x is not a constant, which contradicts the assumption that the terms are linearly dependent. (If instead $a = 0$, then $bx^0 = 0$ forces $b = 0$ too.) The only solution is a = b = 0.

The same degree comparison applies to any two terms of a polynomial of any degree. Strictly speaking, though, pairwise independence alone isn't enough to conclude that all the terms together are linearly independent; HallsofIvy's argument above handles the general case.
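The two-term contradiction can also be checked mechanically (a sketch of my own): if $ax + b = 0$ held at two distinct points, subtracting the two equations would force $a = 0$, and then $b = 0$.

```python
from fractions import Fraction

def fit_line(x0, x1, y0, y1):
    """Solve a*x + b = y at two distinct points for (a, b)."""
    a = Fraction(y1 - y0, x1 - x0)  # subtract the two equations
    b = y0 - a * x0
    return a, b

# if ax + b vanishes at two distinct points, the only solution is a = b = 0
a, b = fit_line(2, 5, 0, 0)
assert (a, b) == (0, 0)
```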