Matrices - eigenvalues

May 2010
27
0
I have a question here relating to getting the eigenvalues of a matrix.
The matrix is 3 x 3. I'm just wondering: when you are computing the determinant det(λI - A), is it OK to row reduce the matrix A first before substituting it into this expression?
 

Krizalid

MHF Hall of Honor
Mar 2007
3,656
1,699
Santiago, Chile
No, because it doesn't matter which elementary row operations you use when you're reducing the matrix to echelon form, but they do affect the determinant det(λI - A), so it has to be computed from the "intact" matrix A.
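If you want to see this concretely, here's a minimal SymPy sketch (the 3 x 3 matrix A is just a made-up example, not the one from the original question): a single elementary row operation on A already changes det(λI - A), so the characteristic polynomial has to be computed from the original matrix.

from sympy import Matrix, eye, symbols

lam = symbols('lambda')

A = Matrix([[1, 2, 0],
            [0, 1, 1],
            [2, 0, 1]])

# One elementary row operation, R3 -> R3 - 2*R1, done by multiplying A on the
# left by the corresponding elementary matrix E
E = eye(3)
E[2, 0] = -2
B = E * A

# Characteristic polynomials det(lambda*I - A) and det(lambda*I - B)
p_A = (lam * eye(3) - A).det().expand()
p_B = (lam * eye(3) - B).det().expand()

print(p_A)  # lambda**3 - 3*lambda**2 + 3*lambda - 5
print(p_B)  # lambda**3 - 3*lambda**2 + 7*lambda - 5  (different polynomial, different roots)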
 

dwsmith

MHF Hall of Honor
Mar 2010
3,093
582
Florida
I have a question here relating to getting the eigenvalues of a matrix.
The matrix is 3 x 3. I'm just wondering: when you are computing the determinant det(λI - A), is it OK to row reduce the matrix A first before substituting it into this expression?
No. If your matrix row reduces to the identity matrix, then det(λI - I) = (λ - 1)^3, so the eigenvalues will all come out as 1, when in fact the eigenvalues may be 2, 3, -2, ....
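To make that concrete, here's a minimal SymPy sketch (the matrix is made up, chosen so its eigenvalues happen to be 2, 3, -2): any invertible matrix row reduces to the identity, and the identity's only eigenvalue is 1, so reducing first wipes out the actual eigenvalues.

from sympy import Matrix

A = Matrix([[2, 1, 0],
            [0, 3, 1],
            [0, 0, -2]])

print(A.eigenvals())    # eigenvalues 2, 3 and -2, each with multiplicity 1
R, pivots = A.rref()    # A is invertible, so its RREF is the identity
print(R)                # Matrix([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(R.eigenvals())    # {1: 3} -- the only eigenvalue is 1, with multiplicity 3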
 
May 2010
27
0
Cool.. (forgot you could completely reduce it to the identity!)
 

dwsmith

MHF Hall of Honor
Mar 2010
3,093
582
Florida
Cool.. (forgot you could completely reduce it to the identity!)
Singular matrices don't reduce to the identity (only invertible matrices do). That was just my example since it's easy to understand.
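For the singular case, here's one more minimal SymPy sketch (again with a made-up matrix): a singular matrix has 0 as an eigenvalue, and its RREF ends up with a zero row, so it never reduces to the identity.

from sympy import Matrix

S = Matrix([[1, 2, 3],
            [2, 4, 6],   # twice the first row, so S is singular
            [0, 1, 1]])

print(S.det())          # 0
R, pivots = S.rref()
print(R)                # bottom row is all zeros, so the RREF is not the identity
print(S.eigenvals())    # 0 appears among the eigenvalues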