Inverse Matrix - Eigenvalues & Eigenvectors (Proof Needed)

Hello,

One of the questions I was presented with in a tutorial the other day really stumped me, and I am unsure how to prove it.

The question is:

"Suppose that A is an invertible matrix and that x is an eigenvector of A with eigenvalue lambda, where lambda is not equal to 0. Show that x is an eigenvector of the inverse matrix A^{-1} with eigenvalue lambda^{-1}."

I have been shown a similar question where you had to prove that the matrix A^{2} has the eigenvalue lambda^{2}, by manipulating the equation Ax = (lambda)x as below:

Ax = (lambda)x

A(Ax) = A(lambda)x

A^{2}x = (lambda)Ax

A^{2}x = (lambda)(lambda)x

A^{2}x = (lambda)^{2}x
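For what it's worth, here is a quick NumPy check I ran to convince myself the A^{2} fact holds numerically (the matrix is just an illustrative example, not from the tutorial):

```python
import numpy as np

# Example upper-triangular matrix: its eigenvalues are 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)

# For each eigenpair (lam, x) of A, check A^2 x = lam^2 x.
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ A @ x, lam**2 * x)
```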

Is the solution to this question reached through the same steps, or is there a step I need to do differently?

Any help would be greatly appreciated :)

Re: Inverse Matrix - Eigenvalues & Eigenvectors (Proof Needed)

Perhaps you should attempt this question before asking for help. Don't be afraid to try.

$\displaystyle Ax=\lambda x \iff A^{-1}Ax=A^{-1}\lambda x \iff x=\lambda A^{-1}x$

Since $\displaystyle \lambda \neq 0$, dividing by $\displaystyle \lambda$ gives $\displaystyle A^{-1}x = \lambda^{-1}x$.

Re: Inverse Matrix - Eigenvalues & Eigenvectors (Proof Needed)

Let $\displaystyle A $ be a matrix with eigenvalue $\displaystyle \lambda $ and eigenvector $\displaystyle x $.

Then $\displaystyle Ax = \lambda x$

Since $\displaystyle A $ is invertible, we can multiply both sides by $\displaystyle A^{-1} $.

$\displaystyle A^{-1} Ax = A^{-1}\lambda x $

$\displaystyle x = A^{-1} \lambda x $

Since $\displaystyle \lambda \neq 0 $, divide both sides by $\displaystyle \lambda $ to solve for $\displaystyle A^{-1} x$.

Thus, $\displaystyle A^{-1} x = \frac{1}{\lambda} x$
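If it helps to see this numerically, here is a small NumPy sanity check of the result (the matrix is just an illustrative choice; this is a check, not a proof):

```python
import numpy as np

# Example symmetric, invertible matrix with nonzero eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
A_inv = np.linalg.inv(A)

eigvals, eigvecs = np.linalg.eig(A)

# For each eigenpair (lam, x) of A, check A^{-1} x = (1/lam) x.
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A_inv @ x, (1.0 / lam) * x)
```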