I have a square matrix A that is nilpotent. I want to show that any eigenvalue of A, say L, must be 0.
I'm not sure how to approach this problem because I have used the definition of an eigenvalue and eigenvector of A corresponding to L:
AX = LX
But I cannot use the fact that A is nilpotent because I don't know how to prove that (A^m)X = (L^m)X. Can anybody help with this?
Thank you!
AH! Thank you very much! I was scratching my head wondering how lambda(AX) = [lambda^2]X, but it follows from the initial identity AX = LX that we had all along.
I'm not sure if I should make another thread for this, but can you help me on a follow-up question? I want to show that the characteristic polynomial is x^n as well, but I'm not sure where to go with det(xI - A) = 0.
note that:
Av = λv for some non-zero vector v, means that:
A^{2}v = A(Av) = A(λv) = λ(Av) = λ(λv) = λ^{2}v.
proof by induction on n that for an eigenvector v of A with eigenvalue λ, A^{n}(v) = λ^{n}v:
for n = 1, we have Av = λv, which is true because λ is an eigenvalue belonging to the eigenvector v.
suppose that for n = k, A^{k}v = λ^{k}v.
then A^{k+1}(v) = A(A^{k}v) = A(λ^{k}v) (by our induction hypothesis)
= λ^{k}(Av) = λ^{k}(λv) = λ^{k+1}v, which is the desired result for n = k+1.
therefore, we have that A^{n}(v) = λ^{n}v for all positive natural numbers n.
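(not part of the proof, but if it helps to see this numerically: a quick numpy sketch, with a made-up 2×2 matrix and an eigenpair chosen by hand so that Av = 3v.)

```python
import numpy as np

# made-up example: A has eigenvector v = (1, 1) with eigenvalue lam = 3
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
lam = 3.0
v = np.array([1.0, 1.0])

assert np.allclose(A @ v, lam * v)  # Av = lam*v, so (lam, v) is an eigenpair

# the induction result: A^k v = lam^k v for every positive integer k
for k in range(1, 6):
    assert np.allclose(np.linalg.matrix_power(A, k) @ v, lam**k * v)
print("A^k v = lam^k v checks out for k = 1..5")
```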
now.. if A is nilpotent, there is some m with A^{m} = 0.
so if λ is an eigenvalue for A, with eigenvector v, we have:
0 = 0v = A^{m}v = λ^{m}v, and since v is non-zero (being an eigenvector) λ^{m} = 0.
that is: λ is a root of x^{m} = (x - 0)^{m}, which has the sole root 0 of (algebraic) multiplicity m.
hence λ = 0; a nilpotent matrix can only have 0 as an eigenvalue.
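(again just a numerical sanity check, assuming numpy; the strictly upper-triangular 3×3 matrix below is my own example of a nilpotent matrix.)

```python
import numpy as np

# strictly upper-triangular => nilpotent (here A^3 = 0)
A = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])
assert np.allclose(np.linalg.matrix_power(A, 3), 0)  # A is nilpotent

# every eigenvalue of a nilpotent matrix is (numerically) 0
assert np.allclose(np.linalg.eigvals(A), 0)
```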
one might ask: how do we know A has any eigenvectors at all?
it could be that A = 0, the 0-matrix. this is certainly nilpotent, and EVERY non-zero vector v is an eigenvector for A, with eigenvalue 0.
otherwise, A ≠ 0, so there is some positive integer r with:
A^{r} ≠ 0, A^{r+1} = 0.
since A^{r} ≠ 0, there exists some vector u ≠ 0, with A^{r}u ≠ 0.
let v = A^{r}u. this is a non-zero vector, and:
Av = A(A^{r}u) = A^{r+1}u = 0u = 0 = 0v, so we see v is an eigenvector (with eigenvalue 0).
thus we are fully justified in assuming A has (at least one) eigenvector(s), which then have associated eigenvalues (which we proved are all 0).
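(the construction above can be carried out concretely; here is a sketch with the 3×3 shift matrix as my example, where r = 2 and u = e3.)

```python
import numpy as np

# the shift matrix: A e3 = e2, A e2 = e1, A e1 = 0 (nilpotent, A^3 = 0)
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

r = 2
assert np.any(np.linalg.matrix_power(A, r) != 0)         # A^r != 0
assert np.allclose(np.linalg.matrix_power(A, r + 1), 0)  # A^{r+1} = 0

u = np.array([0.0, 0.0, 1.0])          # chosen so that A^r u != 0
v = np.linalg.matrix_power(A, r) @ u   # v = A^r u  (here v = e1)
assert np.any(v != 0)                  # v is non-zero...
assert np.allclose(A @ v, 0)           # ...and Av = 0 = 0v: eigenvalue 0
```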
********
EDIT: for your second question, note that if m is the SMALLEST positive integer with A^{m} = 0, then the minimal polynomial m(x) of A divides x^{m}.
thus m(x) = x^{t}, for some positive integer t ≤ m (note t ≠ 0, since the minimal polynomial of A is monic, and thus of degree ≥ 1).
but for all 0 < t < m we have A^{t} ≠ 0 (since m is the smallest positive integer with A^{m} = 0), so no x^{t} with t < m annihilates A. hence t = m, and m(x) = x^{m}.
since by Cayley-Hamilton p(A) = 0, the minimal polynomial m(x) divides the characteristic polynomial p(x), so we have p(x) = x^{m}q(x).
but the roots of q(x) are roots of p(x) as well, and every root of p(x) is 0 (from what we did above). hence q(x) = x^{n-m}, where n is the dimension of dom(A), and therefore p(x) = x^{m}·x^{n-m} = x^{n}, as desired.
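(one last numerical check, assuming numpy: np.poly(A) returns the coefficients of the characteristic polynomial det(xI - A), highest degree first, so for a nilpotent 3×3 example we expect [1, 0, 0, 0], i.e. p(x) = x^3.)

```python
import numpy as np

# nilpotent example with n = 3
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

coeffs = np.poly(A)   # coefficients of det(xI - A)
assert np.allclose(coeffs, [1.0, 0.0, 0.0, 0.0])  # p(x) = x^3
```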