Suppose that $A$ is a square matrix and $\lambda$ is an eigenvalue of $A$.
(i) Show that $\lambda^n$ is an eigenvalue of $A^n$ for all positive integers $n$.
(ii) Suppose $A$ is invertible. Show $\lambda$ is non-zero and that $1/\lambda$ is an eigenvalue of $A^{-1}$.

My method is:
Let $x$ be an eigenvector of $A$ corresponding to the eigenvalue $\lambda$.
Since $Ax = \lambda x$,
I need to show that $A^n x = \lambda^n x$. Applying $A$ repeatedly gives $A^n x = A^{n-1}(Ax) = A^{n-1}(\lambda x) = \lambda A^{n-1}x = \cdots = \lambda^n x$,
which is what I needed to show.
Is this right?
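Not part of the proof, but here is a quick NumPy sanity check of the part (i) claim. The particular matrix below is an arbitrary example I picked for illustration; any square matrix would do:

```python
import numpy as np

# Example matrix (upper triangular, so its eigenvalues 2 and 3 are visible on the diagonal).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

n = 4
eig_A = np.linalg.eigvals(A)                               # eigenvalues of A
eig_An = np.linalg.eigvals(np.linalg.matrix_power(A, n))   # eigenvalues of A^n

# Each eigenvalue of A, raised to the n-th power, appears as an eigenvalue of A^n.
print(np.sort(eig_A**n))
print(np.sort(eig_An))
```

Sorting both lists and comparing shows they agree up to floating-point error.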
For part 2:
If $A$ is invertible then $A^{-1}$ exists.
Like before, $Ax = \lambda x$.
Hence: $A^{-1}Ax = A^{-1}(\lambda x)$, so $x = \lambda A^{-1}x$, and therefore $A^{-1}x = \frac{1}{\lambda}x$.
This is where my method goes awry. $\lambda$ cannot equal zero, or the matrix $A$ is not invertible.
What am I doing wrong??
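For reference, the observation that resolves this step can be written as a short contrapositive argument (this is my phrasing of the standard argument, not a quote from the thread):

```latex
% If \lambda = 0 were an eigenvalue of A, there would be some x \neq 0 with
\lambda = 0 \;\Longrightarrow\; \exists\, x \neq 0 :\; Ax = 0x = 0
\;\Longrightarrow\; \ker A \neq \{0\}
\;\Longrightarrow\; A \text{ is not invertible.}
% Hence A invertible forces \lambda \neq 0, which is exactly what licenses
% dividing by \lambda in the step  x = \lambda A^{-1}x \implies A^{-1}x = \tfrac{1}{\lambda}x.
```

So proving $\lambda \neq 0$ is not an obstacle to the method; it is the first half of what part (ii) asks for.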
This implies that $A^{-1}x = \frac{1}{\lambda}x$. I thought that if the matrix $A$ was multiplied by an eigenvector then only a multiple of this eigenvector was produced. How do you know that the scalar in front of the eigenvector will be its eigenvalue? Since you know that it's going to be a multiple of the eigenvector, did you just decide to make this the eigenvalue since the eigenvector can just be scaled down?
Every eigenvalue has at least one eigenvector. If you read what I wrote you will see that I said that $x$ was an eigenvector corresponding to the eigenvalue $\lambda$.
Then by definition:
$Ax = \lambda x$.
You may have a different convention about this; the other one has:
$xA = \lambda x$ (with $x$ a row vector),
but the argument will still work with a little adaptation.
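A quick NumPy sanity check of the part (ii) claim as well, reusing the same arbitrary example matrix as above:

```python
import numpy as np

# Invertible example matrix: eigenvalues 2 and 3, both non-zero.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eig_A = np.linalg.eigvals(A)
eig_Ainv = np.linalg.eigvals(np.linalg.inv(A))

# Eigenvalues of A^{-1} are the reciprocals of the eigenvalues of A.
print(np.sort(1.0 / eig_A))
print(np.sort(eig_Ainv))
```

Both sorted lists come out as $\{1/3,\, 1/2\}$ up to floating-point error, matching $A^{-1}x = \frac{1}{\lambda}x$.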
CB