# Thread: one-to-one-ness of isomorphisms between polynomial spaces

1. ## one-to-one-ness of isomorphisms between polynomial spaces

Hey guys. While showing one-to-one-ness of a (supposed) isomorphism, I reach

a_1 x + b_1 x^2 = a_2 x + b_2 x^2

Now according to my linear algebra book, two polynomials like these are equal only if their respective coefficients are equal. But if we pick x to be, say, 2, then a_1 can be 2 and b_1 can be 4, whereas on the right side a_2 can be 4 and b_2 can be 3: both sides come out to 20, yet the coefficients are not equal. Please help, thanks
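A quick numeric check of the situation described above (Python used just for illustration, with the numbers from the post):

```python
# The two sides of a_1 x + b_1 x^2 = a_2 x + b_2 x^2, with
# a_1 = 2, b_1 = 4 on the left and a_2 = 4, b_2 = 3 on the right.
p = lambda x: 2*x + 4*x**2
q = lambda x: 4*x + 3*x**2

print(p(2), q(2))  # 20 20 -- the two sides agree at x = 2...
print(p(3), q(3))  # 42 39 -- ...but differ at x = 3
```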

2. ## Re: one-to-one-ness of isomorphisms between polynomial spaces

Given two polynomials p(x) and q(x), the book's claim is that if p(x) = q(x) for all x, then the corresponding coefficients of p(x) and q(x) are the same. Of course, p(2) = q(2) is a weaker statement than p(x) = q(x) for all x (the latter implies the former but not the other way around), so it is not sufficient to guarantee that p(x) and q(x) have the same coefficients.

3. ## Re: one-to-one-ness of isomorphisms between polynomial spaces

polynomials are functions. so an expression like:

p(x) = ax^2 + bx + c really means:

p is the function that, when given a certain value for x, squares x and multiplies the result by the constant a, then adds b times the original value of x, and then adds the constant c:

p = a*(squaring function) + b*(identity function) + c

notice that the line above makes no mention of what "x" is.

now, for a GIVEN x, p(x) is just a number. it is easy to confuse the function p with its value at x, p(x). but it should not be hard to see that the two FUNCTIONS:

p = a*(squaring function) + b*(identity function) + c, and
q = d*(squaring function) + e*(identity function) + f

will only be the same function if a = d, b = e, and c = f.

put another way, we can identify p and q with the graphs of x vs. p(x) and x vs. q(x). even though the two graphs may have points in common (that is, may match for certain x's), they are not the same graph unless they match at every single point (that is, for ALL x's).
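A minimal sketch of the graph picture (Python just for illustration, with made-up example polynomials):

```python
# p(x) = x^2 and q(x) = 3x - 2 have graphs that cross at x = 1 and x = 2,
# so they match at those two points -- but they are different functions,
# because the graphs disagree everywhere else (e.g. at x = 0).
p = lambda x: x**2
q = lambda x: 3*x - 2

print(p(1) == q(1), p(2) == q(2))  # True True -- common points of the two graphs
print(p(0), q(0))                  # 0 -2      -- not the same graph
```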

4. ## Re: one-to-one-ness of isomorphisms between polynomial spaces

So tell me if I got this right: although polynomials at some specific x might be equal without the coefficients being equal, for the polynomials to be equal everywhere (thinking graphically) the coefficients must be the same. If so, does this apply to this example also: a_0 sin(x) + a_1 cos(x) = b_0 sin(x) + b_1 cos(x)? At x = pi/4 the coefficients don't necessarily have to be equal, but for the two sides to be equal for any x the coefficients have to be equal.

5. ## Re: one-to-one-ness of isomorphisms between polynomial spaces

Originally Posted by siryog90
So tell me if I got this right: although polynomials at some specific x might be equal without the coefficients being equal, for the polynomials to be equal everywhere (thinking graphically) the coefficients must be the same.
Yes, this is correct.

Originally Posted by siryog90
If so, does this apply to this example also: a_0 sin(x) + a_1 cos(x) = b_0 sin(x) + b_1 cos(x)? At x = pi/4 the coefficients don't necessarily have to be equal, but for the two sides to be equal for any x the coefficients have to be equal.
First, you can write $$a_0\sin(x)+a_1\cos(x)$$ to get $\displaystyle a_0\sin(x)+a_1\cos(x)$. See the sticky threads in the LaTeX Help subforum for more information. In plain text, it is customary to write a_0 sin(x) or a_0 * sin(x) or just a0 sin(x).

Yes, $\displaystyle a_0\sin(x)+a_1\cos(x)=b_0\sin(x)+b_1\cos(x)$ for all x implies $\displaystyle a_0=b_0$ and $\displaystyle a_1=b_1$. In general, if you have linearly independent vectors (see below) $\displaystyle e_1,\dots,e_n$, then

for any numbers $\displaystyle a_1,\dots,a_n,b_1,\dots,b_n$, if $\displaystyle a_1e_1+\dots+a_ne_n=b_1e_1+\dots+b_ne_n$, then $\displaystyle a_1=b_1,\dots,a_n=b_n$ (*)

Here "vectors" does not necessarily mean two- or three-dimensional Euclidean vectors, but anything that can be added and multiplied by a number. For example, functions from real numbers to real numbers are vectors. Note that, as Deveno wrote, you multiply a function as a whole, not just its value at one particular point. E.g., if $\displaystyle f:x\mapsto x^2$ is a function, then $\displaystyle 5f$ is also a function defined for all x, namely, $\displaystyle 5f:x\mapsto5x^2$.
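A small sketch of that last point, treating 5f as a function in its own right (Python only for illustration):

```python
# f is the squaring function; 5f is the new function x -> 5 * x^2,
# obtained by scaling f as a whole, not just one of its values.
f = lambda x: x**2
five_f = lambda x: 5 * f(x)

print(five_f(3))  # 45, since (5f)(3) = 5 * 3^2
```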

A set $\displaystyle \{e_1,\dots,e_n\}$ of vectors is linearly independent if

for all numbers $\displaystyle a_1,\dots,a_n$, $\displaystyle a_1e_1+\dots+a_ne_n=0$ implies $\displaystyle a_1=\dots =a_n=0$ (**)

For example, 1, x, ..., $\displaystyle x^n$ are linearly independent: e.g., $\displaystyle a_0+a_1x+a_2x^2=0$ for all x implies $\displaystyle a_0=a_1=a_2=0$. Also, sin(x) and cos(x) are linearly independent. Indeed, suppose $\displaystyle a\sin(x)+b\cos(x)=0$ for all x. Taking x = 0 gives b = 0, and taking x = pi/2 gives a = 0. In fact, the whole set sin(x), sin(2x), ..., cos(x), cos(2x), ... is linearly independent, which is very important for Fourier analysis.
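One way to see the independence of sin and cos numerically: if a sin(x) + b cos(x) vanished for all x, it would in particular vanish at x = 0 and x = pi/2, which forces b = 0 and a = 0. A sketch (the helper name is made up):

```python
import math

def vanishes_at_samples(a, b):
    """Check whether a*sin(x) + b*cos(x) = 0 at x = 0 and x = pi/2 (up to rounding)."""
    at_zero    = a * math.sin(0)         + b * math.cos(0)          # equals b
    at_half_pi = a * math.sin(math.pi/2) + b * math.cos(math.pi/2)  # ~equals a
    return abs(at_zero) < 1e-9 and abs(at_half_pi) < 1e-9

print(vanishes_at_samples(0, 0))   # True  -- only the zero combination works
print(vanishes_at_samples(2, -3))  # False -- nonzero already at x = 0
```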

You can show that (**) implies (*), which is not hard and answers your second question affirmatively. For more information, see Vector space and Linear independence in Wikipedia.
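For completeness, the step from (**) to (*) is a one-line subtraction (a sketch, in the notation above):

```latex
a_1e_1+\dots+a_ne_n=b_1e_1+\dots+b_ne_n
\;\Longleftrightarrow\;
(a_1-b_1)e_1+\dots+(a_n-b_n)e_n=0,
```

and (**) applied to the coefficients $\displaystyle a_i-b_i$ gives $\displaystyle a_i-b_i=0$, i.e. $\displaystyle a_i=b_i$ for each i.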

6. ## Re: one-to-one-ness of isomorphisms between polynomial spaces

got it, thanks a lot