1. ## Matrix Polynomial

So here is a question I have been working on, any suggestions would be great.

If $\displaystyle B$ is an $\displaystyle n \times n$ matrix with entries in $\displaystyle F$, then prove that there is a nonzero polynomial $\displaystyle p \in F[t]$ which has $\displaystyle B$ as a root.

So here is what I have so far. We want to show there exist $\displaystyle a_0, \dots, a_r$, not all $0$, such that the polynomial

$\displaystyle p(t) = a_0 + a_1t + \cdots + a_rt^r$

satisfies

$\displaystyle p(B) = a_0I + a_1B + \cdots + a_rB^r = 0.$

So pretty much from this point we just need to show that the set $\displaystyle \{I, B, B^2, \dots, B^r\}$ is linearly dependent for some $\displaystyle r$. Any advice on how to go about doing this?

2. $\displaystyle B=SDS^{-1}=S\begin{bmatrix} \ddots & & \\ & \lambda_{i} & \\ & & \ddots \\ \end{bmatrix}S^{-1}~,~ p(B)=a_0I + a_1B + \cdots + a_rB^r$

$\displaystyle =S\begin{bmatrix} \ddots & & \\ & a_0 + a_1\lambda_{i} + \cdots + a_r\lambda_{i}^{r} & \\ & & \ddots \\ \end{bmatrix}S^{-1}=0$

$\displaystyle \rightarrow a_0 + a_1\lambda_{i} + \cdots + a_r\lambda_{i}^{r} = 0$

$\displaystyle \rightarrow p(t)=k\cdot \det(tI_n-B)~,~k\in F$
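As a concrete sanity check of this claim (a sketch of my own, not part of the post): for a $\displaystyle 2 \times 2$ matrix the characteristic polynomial is $\displaystyle t^2 - \operatorname{tr}(B)\,t + \det(B)$, and substituting $\displaystyle B$ for $\displaystyle t$ does give the zero matrix. The function name `char_poly_check` is my own choice.

```python
def char_poly_check(B):
    """For a 2x2 matrix B, evaluate p(B) where p(t) = t^2 - tr(B) t + det(B),
    i.e. det(t I - B); the result should be the zero matrix."""
    (a, b), (c, d) = B
    tr, det = a + d, a * d - b * c
    # B squared, written out entrywise
    B2 = [[a * a + b * c, a * b + b * d],
          [c * a + d * c, c * b + d * d]]
    # p(B) = B^2 - tr(B)*B + det(B)*I
    return [[B2[i][j] - tr * B[i][j] + det * (i == j) for j in range(2)]
            for i in range(2)]

# B = [[2, 1], [1, 2]] is diagonalizable with eigenvalues 1 and 3
print(char_poly_check([[2, 1], [1, 2]]))  # → [[0, 0], [0, 0]]
```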

3. Originally Posted by math2009
$\displaystyle B=SDS^{-1}=S\begin{bmatrix} \ddots & & \\ & \lambda_{i} & \\ & & \ddots \\ \end{bmatrix}S^{-1}~,~ p(B)=a_0I + a_1B + \cdots + a_rB^r$

$\displaystyle =S\begin{bmatrix} \ddots & & \\ & a_0 + a_1\lambda_{i} + \cdots + a_r\lambda_{i}^{r} & \\ & & \ddots \\ \end{bmatrix}S^{-1}=0$

$\displaystyle \rightarrow a_0 + a_1\lambda_{i} + \cdots + a_r\lambda_{i}^{r} = 0$

$\displaystyle \rightarrow p(t)=k\cdot \det(tI_n-B)~,~k\in F$
You seem to be assuming that $\displaystyle F$ is algebraically closed.

4. Thank you both very much for your help; however, I am still a bit confused. We have not studied determinants (except briefly for $\displaystyle 2 \times 2$ matrices). Would it be possible for you to give a short explanation of your proof?

The Wikipedia page relied heavily on determinants, eigenvectors, and eigenvalues, none of which we have been taught yet, so I found the proofs hard to follow.

Thanks!

5. Okay; let's do it without the machinery.

Recall that $\displaystyle F_{n\times n}$ is a vector space of dimension $\displaystyle n^2$ over $\displaystyle F$. Thus any $\displaystyle n^2+1$ elements of $\displaystyle F_{n\times n}$ are linearly dependent. So consider the set $\displaystyle \{B, B^2, \dots, B^{n^2+1}\}$... I think I've said enough!
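The dimension argument can even be carried out by hand: flatten each power of $\displaystyle B$ into a vector of length $\displaystyle n^2$ and row-reduce, tracking which combination of powers produced each row; the first power that reduces to zero yields the coefficients of an annihilating polynomial. A pure-Python sketch of this (the function name `annihilating_poly` and the exact `Fraction` arithmetic are my own choices, not part of the thread):

```python
from fractions import Fraction
from itertools import count

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def annihilating_poly(B):
    """Return coefficients a_0, ..., a_r (not all zero) with
    a_0 I + a_1 B + ... + a_r B^r = 0.

    The flattened powers I, B, B^2, ... live in an n^2-dimensional space,
    so at most n^2 of them are independent; reduce each new power against
    an echelon basis until one is a combination of the earlier ones.
    """
    n = len(B)
    P = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]  # B^0 = I
    rows = []  # echelon rows: (pivot index, normalized vector, power coeffs)
    for r in count():
        v = [Fraction(x) for row in P for x in row]  # flatten B^r
        c = [Fraction(0)] * (r + 1)                  # tracks v as a combo of powers
        c[r] = Fraction(1)
        for piv, w, d in rows:
            f = v[piv]
            if f:
                v = [vi - f * wi for vi, wi in zip(v, w)]
                c = [ci - f * (d[i] if i < len(d) else 0)
                     for i, ci in enumerate(c)]
        nz = next((i for i, x in enumerate(v) if x), None)
        if nz is None:
            return c  # B^r reduced to zero: these coefficients annihilate B
        inv = 1 / v[nz]
        rows.append((nz, [x * inv for x in v], [x * inv for x in c]))
        P = mat_mul(P, B)  # next power
```

For $\displaystyle B = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix}$ this returns $[6, -5, 1]$, i.e. $\displaystyle p(t) = t^2 - 5t + 6 = (t-2)(t-3)$.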

By the way, the LaTeX command for $\displaystyle \times$ is \times.

6. Thank you very much, I got it!

And thanks for the tip; I'm pretty new to LaTeX.