Math Help - How to tell if a matrix is diagonalizable?

1. How to tell if a matrix is diagonalizable?

If we have a 4x4 matrix A and it has rank = 3 (so nullity = 1), and its eigenvalues are 0, 0, 0, and some other nonzero number (so 0 has algebraic multiplicity 3), then is A diagonalizable?

My understanding is that a matrix is diagonalizable if and only if it has a full set of linearly independent eigenvectors (4 of them, for a 4x4 matrix). It is possible for a matrix with a repeated eigenvalue to be diagonalizable if that repeated eigenvalue has an eigenspace of dimension equal to its multiplicity.

Am I right in assuming that, because the nullity is 1, when the eigenvalue is 0 we get N(A - lambda*I) = N(A - 0I) = N(A), which has dimension 1? That means 0 appears as an eigenvalue 3 times, but it yields only 1 linearly independent eigenvector, because its eigenspace has dimension 1.

So this matrix is not diagonalizable.

This is supposed to be the right answer, but what if the other, nonzero eigenvalue (the 4th one) happened to have an eigenspace of dimension 3, so it could contribute 3 linearly independent eigenvectors? Or is that not possible?

2. If the matrix has 0 as an eigenvalue with algebraic multiplicity 3 then, for this matrix to be diagonalizable, we must have 3 linearly independent vectors that map to 0, which is impossible if the kernel has dimension 1.
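To make this concrete, here is a quick numerical sketch. The matrix below is a hypothetical example (not the one from the question) built to match the setup: eigenvalue 0 with algebraic multiplicity 3 but a 1-dimensional nullspace, plus a nonzero eigenvalue 5. A matrix is diagonalizable exactly when the geometric multiplicities of its eigenvalues sum to n:

```python
import numpy as np

# Hypothetical 4x4 matrix matching the question's setup:
# a single Jordan block for eigenvalue 0 (algebraic multiplicity 3)
# plus a 1x1 block with the nonzero eigenvalue 5.
A = np.array([
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 5],
], dtype=float)
n = 4

rank = np.linalg.matrix_rank(A)
nullity = n - rank                                  # geometric multiplicity of eigenvalue 0
g5 = n - np.linalg.matrix_rank(A - 5 * np.eye(n))   # geometric multiplicity of eigenvalue 5

print(rank, nullity, g5)     # 3 1 1 -- matches rank 3, nullity 1 from the question
# Diagonalizable iff the geometric multiplicities sum to n:
print(nullity + g5 == n)     # False -> only 2 independent eigenvectors, not diagonalizable
```

Computing nullities via `matrix_rank` avoids the numerical instability of extracting eigenvectors of a defective matrix with `np.linalg.eig`.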

3. Originally Posted by Lord Darkin
If we have a 4x4 matrix A and it has rank = 3 (so nullity = 1), and its eigenvalues are 0, 0, 0, and some other nonzero number (so 0 has algebraic multiplicity 3), then is A diagonalizable?

My understanding is that a matrix is diagonalizable if and only if it has a full set of linearly independent eigenvectors (4 of them, for a 4x4 matrix). It is possible for a matrix with a repeated eigenvalue to be diagonalizable if that repeated eigenvalue has an eigenspace of dimension equal to its multiplicity.

Am I right in assuming that, because the nullity is 1, when the eigenvalue is 0 we get N(A - lambda*I) = N(A - 0I) = N(A), which has dimension 1? That means 0 appears as an eigenvalue 3 times, but it yields only 1 linearly independent eigenvector, because its eigenspace has dimension 1.

So this matrix is not diagonalizable.

This is supposed to be the right answer, but what if the other, nonzero eigenvalue (the 4th one) happened to have an eigenspace of dimension 3, so it could contribute 3 linearly independent eigenvectors? Or is that not possible?
Look at the matrix

$(A-\lambda I)$

Since zero is an eigenvalue of multiplicity 3 we get

$(A-0I)=A$

Then we know that the dimension of the nullspace is 1, so it gives only one linearly independent eigenvector.
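This nullspace computation can be checked directly. As a sketch (again using a hypothetical matrix with the same rank and nullity as in the question), scipy can produce a basis for N(A - 0I) = N(A):

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical matrix with eigenvalue 0 of algebraic multiplicity 3
# but nullity 1, as in the question.
A = np.array([
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 5],
], dtype=float)

basis = null_space(A)     # orthonormal basis for the eigenspace of lambda = 0
print(basis.shape[1])     # 1 -> a single independent eigenvector serves all three copies of 0
```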

4. Okay, that part I understand.

So with the 3 eigenvalues of 0, we have only 1 lin. ind. eigenvector.

And that fourth, nonzero eigenvalue cannot have an eigenspace that is 3-dimensional, right? Why would that be impossible? (This is the only part of the question I still need clarified.)

5. Originally Posted by Lord Darkin
Okay, that part I understand.

So with the 3 eigenvalues of 0, we have only 1 lin. ind. eigenvector.

And that fourth, nonzero eigenvalue cannot have an eigenspace that is 3-dimensional, right? Why would that be impossible? (This is the only part of the question I still need clarified.)
The dimension of an eigenspace (the geometric multiplicity) is always less than or equal to the algebraic multiplicity of the eigenvalue. Since your nonzero eigenvalue has algebraic multiplicity 1, its eigenspace has dimension at most 1.
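For contrast, here is a sketch of the case where the bound is attained for every eigenvalue (the matrix below is a hypothetical example, not from this thread): a repeated eigenvalue whose eigenspace dimension equals its algebraic multiplicity, so the geometric multiplicities sum to n and the matrix is diagonalizable:

```python
import numpy as np

# Hypothetical example: eigenvalue 2 repeated twice, but with a full
# 2-dimensional eigenspace, plus simple eigenvalues 3 and 4.
B = np.array([
    [2, 0, 0, 0],
    [0, 2, 0, 0],
    [0, 0, 3, 0],
    [0, 0, 0, 4],
], dtype=float)
n = 4

# Sum the geometric multiplicities (nullity of B - lambda*I) over the
# distinct eigenvalues.
total = sum(n - np.linalg.matrix_rank(B - lam * np.eye(n)) for lam in (2, 3, 4))
print(total == n)   # True -> B is diagonalizable despite the repeated eigenvalue
```

This is exactly the situation described earlier in the thread: a repeated eigenvalue is fine as long as its eigenspace dimension matches its multiplicity, which a multiplicity-1 eigenvalue can never exceed.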

6. All right, thanks!