Hey grandmarquis84.
What is the criterion for diagonalizability (in terms of the eigenvectors and eigenvalues)?
The matrix is
B = | 3  -1 |
    | 1   5 |
and I have to show that it's not diagonalizable. I'm not really sure where to begin. I think the determinant (which I know equals 16) has something to do with it, but I'm not sure where to go from there. I also need to reverse the rows of B and find both the eigenvalues and eigenvectors for this new matrix, as well as the eigenvalues and eigenvectors for B transpose.
Yes, we've learned to find eigenvectors and eigenvalues; it's diagonalization that I don't understand. I know that the trace equals the sum of the eigenvalues, and that to find the eigenvector for a given eigenvalue you subtract that eigenvalue from each entry along the diagonal and then find a vector that the resulting matrix sends to the zero vector.
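Here is a quick pure-Python sketch of that procedure. The matrix M = [[4, 1], [2, 3]] is just a made-up example (not the B from this problem); it has eigenvalue 5 with eigenvector (1, 1):

```python
# Hypothetical example of the procedure: M = [[4, 1], [2, 3]],
# characteristic polynomial λ² − 7λ + 10 = (λ − 5)(λ − 2).
M = [[4, 1], [2, 3]]
lam = 5

# Subtract λ from each diagonal entry:
N = [[M[0][0] - lam, M[0][1]],
     [M[1][0], M[1][1] - lam]]          # [[-1, 1], [2, -2]]

# N sends v = (1, 1) to the zero vector, so v is an eigenvector:
v = (1, 1)
print(N[0][0] * v[0] + N[0][1] * v[1],
      N[1][0] * v[0] + N[1][1] * v[1])  # 0 0

# Check M v = λ v:
print(M[0][0] * v[0] + M[0][1] * v[1],
      M[1][0] * v[0] + M[1][1] * v[1])  # 5 5
```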
https://www.math.okstate.edu/~binega...9/3013-l16.pdf
Basically, if you have a matrix S with the eigenvectors of a matrix A as its columns, then D = S^{-1}AS is diagonal, and the diagonal element D_{ii} is the eigenvalue corresponding to the ith column eigenvector. (This requires the eigenvectors to be linearly independent, so that S is invertible.)
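As a sketch of that statement in plain Python: the matrix A = [[2, 1], [1, 2]] here is a made-up diagonalizable example (not the B from this thread), with eigenvalues 3 and 1 and eigenvectors (1, 1) and (1, -1):

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(X):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b = X[0]
    c, d = X[1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 2]]
S = [[1, 1], [1, -1]]     # columns are the eigenvectors (1, 1) and (1, -1)
D = matmul(inv2(S), matmul(A, S))
print(D)                  # [[3.0, 0.0], [0.0, 1.0]] -- diagonal of eigenvalues
```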
With regard to diagonalization, recall that the eigenvectors are the vectors whose direction doesn't change when the matrix is applied to them, since Ax = λx, where x is an eigenvector and λ is an eigenvalue.
What this means is that you have found a vector that the matrix only scales and nothing else. For an n×n matrix, you have at most n such vectors that are linearly independent of each other, and each one corresponds to a direction in which the operator acts purely by scaling.
If you have n linearly independent eigenvectors, then you have found out exactly how the operator transforms every vector: the eigenvectors give the directions, and the eigenvalues give the scaling in each direction.
To understand the diagonalization aspect, I'd recommend you look at how rotations and other transformations are done with matrices. This PDP^{-1} form is actually a general form where you combine rotations, scalings, or changes of coordinate system. In the diagonalization example, you have a scaling matrix (the eigenvalues on the diagonal) together with change-of-basis matrices that play the role of rotations. When P really is a rotation, it has determinant 1, while the scaling matrix has determinant equal to the product of the eigenvalues.
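Here is a rough numerical illustration of the PDP^{-1} idea, with P an actual 45° rotation and D a scaling matrix (both made up for demonstration, not taken from the thread's B):

```python
import math

def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

theta = math.pi / 4
P = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]        # rotation by 45 degrees
Pinv = [[P[0][0], P[1][0]], [P[0][1], P[1][1]]]  # a rotation's inverse is its transpose
D = [[2, 0], [0, 0.5]]                           # scaling: eigenvalues 2 and 0.5

A = matmul(P, matmul(D, Pinv))                   # A = P D P^{-1}

# A scales the direction (cos θ, sin θ) by 2, exactly as D's first entry says:
v = (math.cos(theta), math.sin(theta))
Av = (A[0][0] * v[0] + A[0][1] * v[1],
      A[1][0] * v[0] + A[1][1] * v[1])
print(Av[0] / v[0], Av[1] / v[1])                # both approximately 2.0

# The rotation has determinant 1; det(D) is the product of the eigenvalues:
det_P = P[0][0] * P[1][1] - P[0][1] * P[1][0]
print(det_P)                                     # approximately 1.0
```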
Take a look at rotations to intuitively understand what is going on.
Whoops. Thanks. λ = 4 and it is repeated. Then substituting into Bx = λx gives
(B − λI)x = 0, where B − 4I = |−1, −1; 1, 1|. That matrix has rank 1, so there is only one independent eigenvector and B can't be diagonalized. You don't have to find the eigenvector.
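A small check of this argument in Python, using the trace and determinant of B:

```python
B = [[3, -1], [1, 5]]
tr = B[0][0] + B[1][1]                        # 8
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]   # 16
disc = tr * tr - 4 * det                      # discriminant of λ² − 8λ + 16
print(disc)                                   # 0 -> one repeated eigenvalue λ = tr/2 = 4

lam = tr // 2
M = [[B[0][0] - lam, B[0][1]],
     [B[1][0], B[1][1] - lam]]
print(M)                                      # [[-1, -1], [1, 1]]

# M is nonzero but has determinant 0, so its rank is 1: the eigenspace is
# one-dimensional and there is no second independent eigenvector.
print(M[0][0] * M[1][1] - M[0][1] * M[1][0])  # 0
```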
If you reverse the rows, you get |1, 5; 3, −1|, with trace 0 and determinant −16, so the characteristic polynomial is λ² − 16 and the eigenvalues are λ = ±4. They are real and distinct, so that matrix is diagonalizable. B transpose has the same characteristic polynomial as B, so it has the same repeated eigenvalue λ = 4 (though its eigenvectors differ from B's).
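As a sanity check, here is a sketch computing the eigenvalues of the row-reversed matrix and of B transpose from the quadratic formula applied to the characteristic polynomial λ² − (trace)λ + det:

```python
import math

def eigenvalues_2x2(M):
    """Real eigenvalues of a 2x2 matrix from λ² − (trace)λ + det = 0
    (raises ValueError on a negative discriminant)."""
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    r = math.sqrt(tr * tr - 4 * det)
    return ((tr + r) / 2, (tr - r) / 2)

B_rows_reversed = [[1, 5], [3, -1]]   # rows of B swapped
B_T = [[3, 1], [-1, 5]]               # transpose of B

ev_rev = eigenvalues_2x2(B_rows_reversed)
ev_T = eigenvalues_2x2(B_T)
print(ev_rev)                         # (4.0, -4.0) -- real and distinct
print(ev_T)                           # (4.0, 4.0)  -- same repeated root as B
```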