1. ## Eigenvalues

Suppose that A is a square matrix and $\displaystyle \lambda$ is an eigenvalue of A.

(i) Show that $\displaystyle \lambda ^n$ is an eigenvalue of $\displaystyle A^n$ for all positive integers n.

(ii) Suppose A is invertible. Show that $\displaystyle \lambda$ is non-zero and that $\displaystyle \lambda ^{-1}$ is an eigenvalue of $\displaystyle A^{-1}$.
My method is:

Let $\displaystyle |A-xI|=f(x)$ be the characteristic polynomial of A.

Since $\displaystyle \lambda$ is an eigenvalue of A, $\displaystyle f(\lambda)=0$

I need to show that $\displaystyle f(\lambda ^{n})= 0$

$\displaystyle f ( \lambda)=0$
$\displaystyle (f(\lambda))^n=0^n$
$\displaystyle (f(\lambda)^n)=f(\lambda ^n)=0$ which is what I needed to show.

Is this right?

For part 2:

If A is invertible then $\displaystyle A^{-1}$ exists.

Like before $\displaystyle f(\lambda)=0$.

$\displaystyle A=\begin{pmatrix} {a_{11}}&{\cdots}&{a_{1n}}\\ {\vdots}&{\ddots}&{\vdots}\\ {a_{n1}}&{\cdots}&{a_{nn}} \end{pmatrix}$

$\displaystyle Det(A- \lambda I)=0$

Hence:

$\displaystyle \det\left(\begin{pmatrix} {a_{11}}&{\cdots}&{a_{1n}}\\ {\vdots}&{\ddots}&{\vdots}\\ {a_{n1}}&{\cdots}&{a_{nn}} \end{pmatrix}-\lambda I\right)=\begin{vmatrix} {a_{11}-\lambda}&{\cdots}&{a_{1n}}\\ {\vdots}&{\ddots}&{\vdots}\\ {a_{n1}}&{\cdots}&{a_{nn}-\lambda} \end{vmatrix}=f(\lambda)=0$

This is where my method goes awry. $\displaystyle f(\lambda)$ cannot equal zero, or else the matrix A is not invertible.

What am I doing wrong??

2. Originally Posted by Showcase_22
My method is:

Let $\displaystyle |A-xI|=f(x)$ be the characteristic polynomial of A.

Since $\displaystyle \lambda$ is an eigenvalue of A, $\displaystyle f(\lambda)=0$

I need to show that $\displaystyle f(\lambda ^{n})= 0$

$\displaystyle f ( \lambda)=0$
$\displaystyle (f(\lambda))^n=0^n$
$\displaystyle (f(\lambda)^n)=f(\lambda ^n)=0$ which is what I needed to show.

Is this right?
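Not quite: the step $\displaystyle (f(\lambda))^n=f(\lambda^n)$ does not hold for a general polynomial $\displaystyle f$. A quick check (added here for illustration; the polynomial and values are a made-up example, not from the problem) makes this concrete:

```python
# Illustrative check (not from the thread): the assumed identity
# (f(lam))**n == f(lam**n) already fails for the simple made-up
# polynomial f(t) = t - 2 with lam = 3, n = 2.
def f(t):
    return t - 2.0

lam, n = 3.0, 2
print((f(lam)) ** n)  # 1.0
print(f(lam ** n))    # 7.0 -- not equal, so the step is invalid
```

So even though $\displaystyle f(\lambda)=0$ forces $\displaystyle (f(\lambda))^n=0$, that tells you nothing about $\displaystyle f(\lambda^n)$; the eigenvector argument below avoids this problem entirely.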
Let $\displaystyle x$ be an eigenvector corresponding to $\displaystyle \lambda$; then:

$\displaystyle A^nx=A^{n-1}Ax=\lambda A^{n-1}x$

which can be used as the basic idea of a proof by induction of the following theorem:

Let $\displaystyle \lambda$ be an eigenvalue of a square matrix $\displaystyle A$ with eigenvector $\displaystyle x$; then $\displaystyle \lambda^n$ is an eigenvalue of $\displaystyle A^n$ with eigenvector $\displaystyle x$.
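A numeric sanity check of this theorem (my own illustration using numpy; the matrix below is an arbitrary example, not from the problem):

```python
import numpy as np

# Check numerically: if A x = lam x, then A^n x = lam^n x for the same x.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors
# (as the columns of the second return value).
eigvals, eigvecs = np.linalg.eig(A)
lam = eigvals[0]
x = eigvecs[:, 0]

n = 5
lhs = np.linalg.matrix_power(A, n) @ x  # A^n x
rhs = lam ** n * x                      # lam^n x

print(np.allclose(lhs, rhs))  # True
```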

CB

3. Originally Posted by CaptainBlack
Let $\displaystyle x$ be an eigenvector corresponding to $\displaystyle \lambda$; then:

$\displaystyle A^nx=A^{n-1}Ax=\lambda A^{n-1}x$

which can be used as the basic idea of a proof by induction of the following theorem:

Let $\displaystyle \lambda$ be an eigenvalue of a square matrix $\displaystyle A$ with eigenvector $\displaystyle x$; then $\displaystyle \lambda^n$ is an eigenvalue of $\displaystyle A^n$ with eigenvector $\displaystyle x$.

CB
$\displaystyle A^nx=A^{n-1}Ax=\lambda A^{n-1}x$

This implies that $\displaystyle Ax=\lambda x$. I thought that if the matrix A was multiplied by an eigenvector, then only a multiple of this eigenvector was produced. How do you know that the scalar in front of the eigenvector will be its eigenvalue? Since you know that it's going to be a multiple of the eigenvector, did you just decide to make this the eigenvalue, since the eigenvector can just be scaled down?

4. Originally Posted by Showcase_22
$\displaystyle A^nx=A^{n-1}Ax=\lambda A^{n-1}x$

This implies that $\displaystyle Ax=\lambda x$. I thought that if the matrix A was multiplied by an eigenvector, then only a multiple of this eigenvector was produced. How do you know that the scalar in front of the eigenvector will be its eigenvalue? Since you know that it's going to be a multiple of the eigenvector, did you just decide to make this the eigenvalue, since the eigenvector can just be scaled down?
Every eigenvalue has at least one eigenvector. If you read what I wrote, you will see that I said that $\displaystyle x$ was an eigenvector corresponding to the eigenvalue $\displaystyle \lambda$ of $\displaystyle A$.

Then by definition:

$\displaystyle Ax=\lambda x$

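For part (ii), the same definition gives the standard argument (filling in a step not spelled out in the thread): if $\displaystyle Ax=\lambda x$ with $\displaystyle A$ invertible, then $\displaystyle x=\lambda A^{-1}x$, so $\displaystyle \lambda\neq 0$ (otherwise $\displaystyle Ax=0$ for a non-zero $\displaystyle x$, contradicting invertibility) and $\displaystyle A^{-1}x=\lambda^{-1}x$. A numeric check (my illustration; the matrix is an arbitrary invertible example):

```python
import numpy as np

# Illustration of part (ii): if A is invertible and A x = lam x,
# then x = lam * A^{-1} x, so A^{-1} x = (1/lam) x.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # invertible: det(A) = 10

eigvals, eigvecs = np.linalg.eig(A)
lam = eigvals[0]
x = eigvecs[:, 0]

# lam must be non-zero when A is invertible.
assert abs(lam) > 1e-12

Ainv = np.linalg.inv(A)
print(np.allclose(Ainv @ x, x / lam))  # True
```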