- Nov 18th 2010, 11:53 AM, sael: Equating Coefficients of Polynomials Proof
Hi,

I intuitively understand Equating Coefficients of Polynomials but I was hoping to find a proof of the method. Does anyone know where I might find one?

Cheers,

Scott.

- Nov 20th 2010, 01:12 PM, sael:
I think I have figured it out.

Take, say, three arbitrary functions of x, $\displaystyle f_1(x)$, $\displaystyle f_2(x)$ and $\displaystyle f_3(x)$, and constants $\displaystyle a,b,c,i,j,k$. The equation

$\displaystyle af_1(x)+bf_2(x)+cf_3(x)=if_1(x)+jf_2(x)+kf_3(x)$

can be rewritten as

$\displaystyle (a-i)f_1(x)+(b-j)f_2(x)+(c-k)f_3(x)=0$

If the functions are linearly independent then the only solution to the equation is when the coefficients of the functions are all zero. This leads to:

$\displaystyle a=i$, $\displaystyle b=j$, $\displaystyle c=k$

This shows that the coefficients on each side of the original equation must be equal. The idea extends to any number of linearly independent functions.
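As a quick sanity check (a sketch, assuming SymPy and the hypothetical choice $\displaystyle f_1(x)=1$, $\displaystyle f_2(x)=x$, $\displaystyle f_3(x)=x^2$), subtracting the two sides and collecting coefficients reproduces $\displaystyle a-i$, $\displaystyle b-j$, $\displaystyle c-k$:

```python
import sympy as sp

# symbolic variable and the six constants from the post
x, a, b, c, i, j, k = sp.symbols('x a b c i j k')

# hypothetical choice of linearly independent functions: 1, x, x^2
lhs = a*1 + b*x + c*x**2
rhs = i*1 + j*x + k*x**2

# subtracting gives (a-i) + (b-j)x + (c-k)x^2; each coefficient must vanish
diff = sp.Poly(lhs - rhs, x)
print(diff.all_coeffs())  # highest degree first: [c - k, b - j, a - i]
```

Setting each coefficient to zero and solving gives a = i, b = j, c = k, matching the argument above.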

The "x" terms of a polynomial are linearly independent, so their coefficients can be equated as well.

- Nov 21st 2010, 03:41 AM, HallsofIvy:
Of course, to complete the proof, one would have to show that the different powers of x, namely 1, x, $\displaystyle x^2$, etc., are "independent". That means showing that $\displaystyle a_1+ a_2x+ a_3x^2+ \cdots+ a_{n+1}x^n= 0$ only if $\displaystyle a_1= a_2= a_3= \cdots= a_{n+1}= 0$.

One way to do that is to take n+1 different values for x to get n+1 equations for the coefficients. If we take x= 0 we get $\displaystyle a_1= 0$ immediately. If we take x= 1 we get $\displaystyle a_1+ a_2+ a_3+ \cdots+ a_{n+1}= 0$, etc.
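This evaluation argument can be sketched numerically (assuming NumPy; the degree n = 4 and the sample points 0, 1, ..., n are hypothetical choices). Evaluating at n+1 distinct points gives a Vandermonde system, and since the Vandermonde matrix at distinct points is invertible, the only coefficient vector producing the zero polynomial is the zero vector:

```python
import numpy as np

# Evaluating a degree-n polynomial at n+1 distinct points x_0, ..., x_n
# gives the linear system V @ coeffs = values, with V the Vandermonde matrix.
n = 4
points = np.arange(n + 1, dtype=float)           # n+1 distinct values of x
V = np.vander(points, N=n + 1, increasing=True)  # columns: 1, x, x^2, ..., x^n

# If the polynomial is identically zero, all sampled values are zero;
# V is invertible for distinct points, so coeffs must be the zero vector.
values = np.zeros(n + 1)
coeffs = np.linalg.solve(V, values)
print(coeffs)  # all zeros
```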

A simpler, but more sophisticated, way would be to take derivatives: since $\displaystyle a_1+ a_2x+ a_3x^2+ \cdots+ a_{n+1}x^n= 0$ for all x, its derivative, $\displaystyle a_2+ 2a_3x+ \cdots+ na_{n+1}x^{n-1}$, is also 0 for all x. Taking x= 0, we now get $\displaystyle a_2= 0$. Of course, since $\displaystyle a_2+ 2a_3x+ \cdots+ na_{n+1}x^{n-1}= 0$ for all x, its derivative is 0 as well. Continuing through the nth derivative gives 0 for all n+1 coefficients.

- Nov 21st 2010, 12:03 PM, sael:
The way I see the linear independence of the polynomial terms is to take any two of them, say $\displaystyle ax^1$ and $\displaystyle bx^0$. If they were linearly dependent then there would be some a and b, not both zero, with:

$\displaystyle ax^1 + bx^0=0$, so $\displaystyle ax^1=-bx^0$ and $\displaystyle x=-b/a$

But x is not a constant, which contradicts the original assumption that the terms are linearly dependent. So a and b must both be zero.

This argument works for any two terms of a polynomial of any degree, so the terms are pairwise linearly independent. (Pairwise independence alone does not in general give independence of the whole set, though, so an argument like the one above is still needed for the full collection of powers.)
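HallsofIvy's derivative argument can also be checked symbolically (a sketch, assuming SymPy and a degree-4 polynomial as a hypothetical example): differentiating m times and setting x = 0 isolates $\displaystyle m!\,a_{m+1}$, so every coefficient is recovered individually, and a polynomial that is identically zero forces them all to be zero.

```python
import sympy as sp

x = sp.symbols('x')
a = sp.symbols('a1:6')  # coefficients a1, ..., a5 of a degree-4 polynomial
p = sum(ai * x**m for m, ai in enumerate(a))

# the m-th derivative at x = 0 equals m! * a_{m+1}
recovered = [sp.diff(p, x, m).subs(x, 0) / sp.factorial(m) for m in range(5)]
print(recovered)  # [a1, a2, a3, a4, a5]
```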