# polynomial rings

• Feb 27th 2009, 08:22 PM
dori1123
polynomial rings
Let $\displaystyle f(x)$ be a polynomial in $\displaystyle \mathbb{Z}[x]$. Prove that if $\displaystyle f(x)$ has a root $\displaystyle \alpha \in \mathbb{Z}$, then $\displaystyle f(x)$ has a linear factor in $\displaystyle \mathbb{Z}[x]$.

Suppose $\displaystyle f(x)$ has a root $\displaystyle \alpha \in \mathbb{Z}$, then $\displaystyle f(x)$ is reducible, and then $\displaystyle f(x) = a(x)(x - \alpha)$ for some nonconstant polynomial $\displaystyle a(x) \in \mathbb{Z}[x]$. So $\displaystyle x - \alpha$ is a linear factor in $\displaystyle \mathbb{Z}[x]$.

Is this correct? Did I miss anything important in this proof?
• Feb 28th 2009, 12:01 AM
kalagota
Quote:

Originally Posted by dori1123
Let $\displaystyle f(x)$ be a polynomial in $\displaystyle \mathbb{Z}[x]$. Prove that if $\displaystyle f(x)$ has a root $\displaystyle \alpha \in \mathbb{Z}$, then $\displaystyle f(x)$ has a linear factor in $\displaystyle \mathbb{Z}[x]$.

Suppose $\displaystyle f(x)$ has a root $\displaystyle \alpha \in \mathbb{Z}$, then $\displaystyle f(x)$ is reducible, and then $\displaystyle f(x) = a(x)(x - \alpha)$ for some nonconstant polynomial $\displaystyle a(x) \in \mathbb{Z}[x]$. So $\displaystyle x - \alpha$ is a linear factor in $\displaystyle \mathbb{Z}[x]$.

Is this correct? Did I miss anything important in this proof?

there is no assumption on the degree of $\displaystyle f$, so you cannot say that $\displaystyle a(x)$ is nonconstant (if $\displaystyle f$ is itself linear, then $\displaystyle a(x)$ is a constant).

well, this is a direct consequence of the division algorithm. since you expect $\displaystyle x-\alpha$ to be the linear factor, try writing $\displaystyle f(x)=g(x)(x-\alpha) + R$, where $\displaystyle R \in \mathbb{Z}$ is a constant (possibly zero), and then show $\displaystyle R=0$.
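As a quick sanity check of this hint (a sketch only, not part of the proof; the helper name here is made up for illustration), dividing by $\displaystyle x-\alpha$ over $\displaystyle \mathbb{Z}$ is just synthetic division, and the remainder it produces is exactly $\displaystyle f(\alpha)$:

```python
def divide_by_linear(coeffs, alpha):
    """Divide f(x) by (x - alpha) over Z using synthetic division.

    coeffs lists f's coefficients from highest degree to lowest.
    Returns (quotient_coeffs, remainder); the remainder equals f(alpha).
    """
    partial = []
    r = 0
    for c in coeffs:
        r = r * alpha + c      # Horner's scheme
        partial.append(r)
    return partial[:-1], partial[-1]

# f(x) = x^3 - 6x^2 + 11x - 6 has the root alpha = 2
q, r = divide_by_linear([1, -6, 11, -6], 2)
print(q, r)  # [1, -4, 3] 0, i.e. f(x) = (x^2 - 4x + 3)(x - 2)
```

Since all the intermediate values are integer sums and products of integers, the quotient automatically stays in $\displaystyle \mathbb{Z}[x]$.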
• Feb 28th 2009, 12:35 PM
dori1123
Quote:

Originally Posted by kalagota
there is no assumption on the degree of $\displaystyle f$ so you cannot say that $\displaystyle a(x)$ is non constant..

well, this is a direct consequence of the division algorithm. since you think that $\displaystyle x-\alpha$ is the linear factor, then try writing $\displaystyle f(x)=g(x)(x-\alpha) + R$ where $\displaystyle R$ is a non zero constant or $\displaystyle R=0$.

But $\displaystyle \mathbb{Z}[x]$ is not a field, so there's no Euclidean algorithm...
• Feb 28th 2009, 03:29 PM
ThePerfectHacker
Quote:

Originally Posted by dori1123
But $\displaystyle \mathbb{Z}[x]$ is not a field, there's no Euclidean Algorithm...

The division algorithm does work in some rings that are not fields, and $\displaystyle \mathbb{Z}[x]$ is an example: you can divide by any monic polynomial, and $\displaystyle x-\alpha$ is monic. (Wink)
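To see why the monic condition matters (a minimal sketch; the function name is invented for this example): at each step of long division you subtract an integer multiple of the divisor, so no fractions ever appear when the leading coefficient is $\displaystyle 1$.

```python
def poly_divmod_monic(f, g):
    """Divide f by a monic g in Z[x]; coefficients listed highest degree first.

    Returns (quotient, remainder) with deg(remainder) < deg(g).
    """
    assert g[0] == 1, "divisor must be monic for division to stay in Z[x]"
    f = list(f)
    q = []
    while len(f) >= len(g):
        lead = f[0]                 # leading coefficient of current remainder
        q.append(lead)
        for i in range(len(g)):
            f[i] -= lead * g[i]     # subtract lead * g * x^k; cancels f[0]
        f.pop(0)
    return q, f

# x^3 + 1 divided by x + 1: quotient x^2 - x + 1, remainder 0
print(poly_divmod_monic([1, 0, 0, 1], [1, 1]))  # ([1, -1, 1], [0])
```

Dividing by a non-monic polynomial, say $\displaystyle x$ by $\displaystyle 2x$, would require the coefficient $\displaystyle 1/2$, which is exactly what fails in $\displaystyle \mathbb{Z}[x]$.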
• Feb 28th 2009, 05:04 PM
kalagota
another example is $\displaystyle \mathbb{Z}$ itself.. :)
• Feb 28th 2009, 05:31 PM
dori1123
Suppose $\displaystyle f(x) \in \mathbb{Z}[x]$ has a root $\displaystyle \alpha \in \mathbb{Z}$. Since $\displaystyle x-\alpha$ is monic, the division algorithm gives $\displaystyle f(x) = q(x)(x-\alpha)+r$ for some polynomial $\displaystyle q(x) \in \mathbb{Z}[x]$ and some $\displaystyle r \in \mathbb{Z}$. Substituting $\displaystyle x=\alpha$ gives $\displaystyle 0 = f(\alpha) = q(\alpha)(\alpha-\alpha)+r = r$, so $\displaystyle r=0$. Thus $\displaystyle f(x)=q(x)(x-\alpha)$, and $\displaystyle x-\alpha$ is a linear factor of $\displaystyle f(x)$ in $\displaystyle \mathbb{Z}[x]$. Is this correct?

Also, for the converse: if $\displaystyle f(x) \in \mathbb{Z}[x]$ has a linear factor, then that linear factor must be monic (or of the form $\displaystyle \pm(x-a)$) to guarantee that $\displaystyle f(x)$ has a root in $\displaystyle \mathbb{Z}$. Right? That's the only condition I can think of.
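A small check of this point (illustrative only): a non-monic linear factor need not produce an integer root.

```python
# f(x) = (2x - 1)(x + 3) = 2x^2 + 5x - 3 factors into linear pieces in Z[x],
# but its roots are 1/2 and -3; only the monic factor x + 3 yields a root in Z.
def f(x):
    return 2 * x**2 + 5 * x - 3

integer_roots = [a for a in range(-10, 11) if f(a) == 0]
print(integer_roots)  # [-3]
```

So a linear factor guarantees a rational root, but only a monic (up to sign) linear factor guarantees an integer one.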
• Feb 28th 2009, 07:12 PM
ThePerfectHacker
Quote:

Originally Posted by dori1123

Also, for the converse to be true, if $\displaystyle f(x) \in \mathbb{Z}[x]$ has a linear factor, then the linear factor must be a monic polynomial to guarantee that $\displaystyle f(x)$ has a root in $\displaystyle \mathbb{Z}$. Right? That's the only condition I can think of.

If $\displaystyle f(x) = (x-a)g(x)$ then $\displaystyle f(a) = 0$, so $\displaystyle a$ is a zero.
• Feb 28th 2009, 08:26 PM
dori1123
Quote:

Originally Posted by ThePerfectHacker
If $\displaystyle f(x) = (x-a)g(x)$ then $\displaystyle f(a) = 0$, so $\displaystyle a$ is a zero.

I understand that the converse is true if $\displaystyle f(a)=0$ but why is $\displaystyle a$ zero?
• Feb 28th 2009, 08:44 PM
ThePerfectHacker
Quote:

Originally Posted by dori1123
I understand that the converse is true if $\displaystyle f(a)=0$ but why is $\displaystyle a$ zero?

No, $\displaystyle a$ is not zero, $\displaystyle a$ is a zero of $\displaystyle f$. This means $\displaystyle f(a) = 0$.
• Feb 28th 2009, 09:42 PM
dori1123
Quote:

Originally Posted by ThePerfectHacker
No, $\displaystyle a$ is not zero, $\displaystyle a$ is a zero of $\displaystyle f$. This means $\displaystyle f(a) = 0$.

Oh, you mean a root. Sorry, I'm used to saying roots.