I am reading Nicholson: Introduction to Abstract Algebra, Section 6.3 Splitting Fields.

Example 1 reads as follows: (see attachment)

--------------------------------------------------------------------------------------------------

Example 1. Find an extension $\displaystyle E \supseteq \mathbb{Z}_2 $ in which $\displaystyle f(x) = x^3 + x + 1 $ factors completely into linear factors.

--------------------------------------------------------------------------------------------------

The solution reads as follows:

-------------------------------------------------------------------------------------------------

Solution. The polynomial f(x) is irreducible over $\displaystyle \mathbb{Z}_2 $ (it has no root in $\displaystyle \mathbb{Z}_2 $ ), so

$\displaystyle E = \{ a_0 + a_1 t + a_2 t^2 \ | \ a_i \in \mathbb{Z}_2 , f(t) = 0 \} $

is a field containing a root t of f(x).

Hence $\displaystyle x + t = x - t $ is a factor of f(x).

The division algorithm gives $\displaystyle f(x) = (x+t) g(x) $ where $\displaystyle g(x) = x^2 + tx + (1 + t^2) $, so it suffices to show that g(x) also factors completely in E.

Trial and error gives $\displaystyle g(t^2) = 0 $, so $\displaystyle g(x) = (x + t^2)(x + v) $ for some $\displaystyle v \in E$.

... ... etc (see attachment)

-------------------------------------------------------------------------------------------------------------

My problem is that I cannot show how $\displaystyle g(t^2) = 0 $ implies that $\displaystyle g(x) = (x + t^2)(x + v) $ for some $\displaystyle v \in E$.
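For what it's worth, I did a quick numerical sanity check of the book's claim. The code below is my own sketch, not from Nicholson: it represents each element of $\displaystyle E = \mathbb{Z}_2[t]/(t^3 + t + 1) $ (a field with 8 elements) as a 3-bit integer whose bit i is the coefficient of $\displaystyle t^i $, and confirms that $\displaystyle g(t^2) = 0 $ and that g has exactly one other root v in E, with $\displaystyle t^2 + v = t $ and $\displaystyle t^2 \cdot v = 1 + t^2 $, i.e. $\displaystyle g(x) = (x + t^2)(x + v) $:

```python
# Sanity check in E = Z_2[t]/(t^3 + t + 1), a field of 8 elements.
# Elements are ints 0..7; bit i is the coefficient of t^i.
# (This bitmask representation is my own choice, not from the book.)
M = 0b1011  # the modulus t^3 + t + 1

def mul(a, b):
    """Multiply in E: carry-less (XOR) product, then reduce mod t^3 + t + 1."""
    p = 0
    for i in range(3):
        if (b >> i) & 1:
            p ^= a << i
    for i in (4, 3):          # clear any t^4 and t^3 terms
        if (p >> i) & 1:
            p ^= M << (i - 3)
    return p

t = 0b010          # the adjoined root t
t2 = mul(t, t)     # t^2

def g(x):
    """g(x) = x^2 + t*x + (1 + t^2), with coefficients in E (+ is XOR)."""
    return mul(x, x) ^ mul(t, x) ^ 1 ^ t2

print(g(t2))                                   # 0, so t^2 is a root of g
roots = [x for x in range(8) if g(x) == 0]
v = [r for r in roots if r != t2][0]           # the other root of g
# (x + t^2)(x + v) = x^2 + (t^2 + v)x + t^2*v; compare with g:
print(t2 ^ v == t, mul(t2, v) == (1 ^ t2))     # True True
```

Of course this only verifies the factorization numerically; it doesn't explain *why* $\displaystyle g(t^2) = 0 $ forces the second factor to be linear, which is what I'm asking about.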

I would appreciate some help.

Peter