# Matrix Polynomial

• Feb 8th 2010, 04:40 PM
joe909
Matrix Polynomial
So here is a question I have been working on, any suggestions would be great.

If $B$ is an $n \times n$ matrix with entries in $F$, then prove that there is a nonzero polynomial $p \in F[t]$ which has $B$ as a root.

So here is what I have so far. We need to find $a_0, \dots, a_r$, not all $0$, such that for

$p(t) = a_0 + a_1t + \cdots + a_rt^r$

we have

$p(B) = a_0I + a_1B + \cdots + a_rB^r = 0$

So pretty much from this point we just need to show that the set $\{I, B, B^2, \dots, B^r\}$ is linearly dependent for some $r$. Any advice on how to go about doing this?
• Feb 8th 2010, 05:01 PM
Bruno J.
Use the Cayley-Hamilton Theorem! (Smirk)
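As a quick numerical sanity check of Cayley–Hamilton (a sketch in Python/NumPy, with a made-up example matrix $B$; `np.poly` returns the coefficients of the characteristic polynomial):

```python
import numpy as np

# Made-up 3x3 example matrix (any square matrix works here).
B = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])

# np.poly(B) gives the coefficients of det(tI - B),
# highest degree first: [1, c_{n-1}, ..., c_0].
coeffs = np.poly(B)

# Evaluate p(B) via Horner's rule: p(B) = B^n + c_{n-1}B^{n-1} + ... + c_0 I.
n = B.shape[0]
pB = np.zeros_like(B)
for c in coeffs:
    pB = pB @ B + c * np.eye(n)

# Cayley-Hamilton: every matrix is a root of its own characteristic polynomial.
print(np.allclose(pB, 0))  # True
```

So the characteristic polynomial is one explicit choice of the nonzero $p$ asked for, of degree exactly $n$.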
• Feb 9th 2010, 04:57 AM
math2009
$B = SDS^{-1} = S\begin{bmatrix} \ddots & & \\ & \lambda_{i} & \\ & & \ddots \end{bmatrix}S^{-1}$

$p(B) = a_0I + a_1B + \cdots + a_rB^r = S\begin{bmatrix} \ddots & & \\ & a_0 + a_1\lambda_{i} + \cdots + a_r\lambda_{i}^{r} & \\ & & \ddots \end{bmatrix}S^{-1} = 0$

$\Rightarrow a_0 + a_1\lambda_{i} + \cdots + a_r\lambda_{i}^{r} = 0$ for each $i$

$\Rightarrow p(t) = k\cdot \det(tI_n - B)~,~k \in F$
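The identity above says that $p(B) = S\,\mathrm{diag}(p(\lambda_i))\,S^{-1}$, so $p(B) = 0$ forces $p$ to vanish at every eigenvalue. A small numerical illustration (a sketch; the rotation matrix here is a made-up example whose eigenvalues $\pm i$ only exist over an algebraically closed field, as noted below):

```python
import numpy as np

# Made-up diagonalizable example: a rotation matrix with eigenvalues +-i.
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals, S = np.linalg.eig(B)   # B = S D S^{-1}
coeffs = np.poly(B)             # characteristic polynomial, here t^2 + 1

# Since p(B) = S diag(p(lambda_i)) S^{-1}, p(B) = 0 means
# p vanishes at each eigenvalue.
p_at_eigs = np.polyval(coeffs, eigvals)
print(np.allclose(p_at_eigs, 0))  # True
```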
• Feb 9th 2010, 08:16 AM
Bruno J.
Quote:

Originally Posted by math2009
$B = SDS^{-1} = S\begin{bmatrix} \ddots & & \\ & \lambda_{i} & \\ & & \ddots \end{bmatrix}S^{-1}$

$p(B) = a_0I + a_1B + \cdots + a_rB^r = S\begin{bmatrix} \ddots & & \\ & a_0 + a_1\lambda_{i} + \cdots + a_r\lambda_{i}^{r} & \\ & & \ddots \end{bmatrix}S^{-1} = 0$

$\Rightarrow a_0 + a_1\lambda_{i} + \cdots + a_r\lambda_{i}^{r} = 0$ for each $i$

$\Rightarrow p(t) = k\cdot \det(tI_n - B)~,~k \in F$

You seem to be assuming that $F$ is algebraically closed.
• Feb 9th 2010, 12:38 PM
joe909
Thank you both very much for your help; however, I am still a bit confused. We have not studied determinants (except briefly for $2 \times 2$ matrices). Would it be possible for you to give a short explanation of your proof?

The Wikipedia page dealt a lot with determinants, eigenvectors, and eigenvalues, none of which we have been taught yet, so I found the proofs hard to follow.

Thanks!
• Feb 9th 2010, 01:38 PM
Bruno J.
Okay; let's do it without the machinery.

Recall that $F_{n\times n}$ is a vector space of dimension $n^2$ over $F$. Thus any $n^2+1$ elements of $F_{n\times n}$ are linearly dependent. So consider the set $\{I, B, B^2, \dots, B^{n^2}\}$... I think I've said enough!
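This dimension-counting argument can be carried out concretely: vectorize the $n^2+1$ powers of $B$, find a nonzero linear dependence among them, and read off the polynomial coefficients. A sketch in Python/NumPy, with a made-up $2 \times 2$ example (the nullspace vector is extracted via SVD, one standard numerical choice):

```python
import numpy as np

# Made-up 2x2 example; any n x n matrix works.
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = B.shape[0]

# Vectorize the powers I, B, B^2, ..., B^{n^2} as columns:
# n^2 + 1 vectors in an n^2-dimensional space, hence linearly dependent.
powers = [np.linalg.matrix_power(B, k) for k in range(n * n + 1)]
M = np.column_stack([P.flatten() for P in powers])

# A nonzero nullspace vector of M gives coefficients a_0, ..., a_{n^2}
# with a_0 I + a_1 B + ... = 0. The last right-singular vector works,
# since rank(M) <= n^2 < n^2 + 1.
_, s, Vt = np.linalg.svd(M)
a = Vt[-1]   # unit vector, so certainly nonzero

pB = sum(c * P for c, P in zip(a, powers))
print(np.allclose(pB, 0))  # True
```

The resulting coefficients define a nonzero polynomial $p$ with $p(B) = 0$, which is exactly what the problem asks for; no determinants needed.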

By the way, the LaTeX symbol for $\times$ is \times. (Cool)
• Feb 9th 2010, 02:48 PM
joe909
Thank you very much, I got it! :)

And thanks for the tip, I'm pretty new to LaTeX.