One of the questions I was presented with in a tutorial the other day really stumped me, and I am unsure how to prove it.
The question is:
"Suppose that A is an invertible matrix and that x is an eigenvector for A with eigenvalue 'lambda cannot equal 0'. Show that x is an eigenvector for the inverse matrix of A with eigenvalue 'lambda-1'.
I have been shown a similar question where you had to prove that the matrix $A^2$ has an eigenvalue of $\lambda^2$ by manipulating the equation $Ax = \lambda x$, as below:
$$
\begin{aligned}
Ax &= \lambda x \\
A(Ax) &= A(\lambda x) \\
A^2 x &= \lambda Ax \\
A^2 x &= \lambda(\lambda x) \\
A^2 x &= \lambda^2 x
\end{aligned}
$$
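As a quick numerical sanity check of both claims (a sketch using numpy, with a hypothetical invertible $2 \times 2$ matrix chosen only for illustration):

```python
import numpy as np

# Hypothetical invertible 2x2 matrix (example only, any invertible matrix works)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Get an eigenvalue lam and corresponding eigenvector x of A
eigvals, eigvecs = np.linalg.eig(A)
lam = eigvals[0]
x = eigvecs[:, 0]

# Check the worked example: A^2 x = lambda^2 x
assert np.allclose(A @ A @ x, lam**2 * x)

# Check the claim from the question: A^{-1} x = lambda^{-1} x
assert np.allclose(np.linalg.inv(A) @ x, x / lam)

print("both eigenvalue relations hold numerically")
```

Of course, a numerical check is not a proof, but it can confirm you are manipulating the right relation before writing the argument out.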
Is the solution to this question reached through the same steps, or is there a step I need to do differently?
Any help would be greatly appreciated.