Transformation raised to power

This problem comes from our Final Review. It states:

Suppose V is a vector space over a field F, and T: V --> V is a linear map. Suppose also that, for a vector v in V, T^(k+1) * v = 0 for some positive integer k. If (T^k)v is not equal to zero, show:

i.) {v, Tv, (T^2)v, ..., (T^k)v} is linearly independent;

ii.) The span of {v, Tv, (T^2)v, ..., (T^k)v} is a T-invariant subspace of V.

Our professor hasn't been using a textbook and google doesn't understand maths (Headbang).

Re: Transformation raised to power

i) let v be a vector in V such that T^(k+1)v = 0 but (T^k)v ≠ 0. suppose that c_0*v + c_1*Tv + ... + c_k*(T^k)v = 0.

taking T^k of both sides, every term but c_0*(T^k)v vanishes. since (T^k)v ≠ 0, we must have c_0 = 0.

thus we have c_1*Tv + c_2*(T^2)v + ... + c_k*(T^k)v = 0, and now we take T^(k-1) of both sides to show that c_1 = 0.

continuing in this way, we eventually find that c_0 = c_1 = ... = c_k = 0, which shows that {v, Tv, ..., (T^k)v} is linearly independent.
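The inductive argument can also be checked numerically. Here's a minimal sketch in Python/NumPy; the specific T (a 4x4 subdiagonal "shift" matrix) and v = e1 are assumed for illustration, not part of the original problem:

```python
import numpy as np

# Assumed example: T is the 4x4 shift matrix (1's on the subdiagonal),
# so T^4 = 0 while T^3 e1 != 0; here k = 3 and v = e1.
T = np.zeros((4, 4))
T[1, 0] = T[2, 1] = T[3, 2] = 1.0
v = np.array([1.0, 0.0, 0.0, 0.0])

# Stack v, Tv, T^2 v, T^3 v as columns of a matrix.
cols = np.column_stack([np.linalg.matrix_power(T, j) @ v for j in range(4)])

# The hypotheses of the problem hold: T^(k+1) v = 0 and (T^k)v != 0.
assert np.allclose(np.linalg.matrix_power(T, 4) @ v, 0)
assert not np.allclose(np.linalg.matrix_power(T, 3) @ v, 0)

# Linear independence: the four columns have full rank.
print(np.linalg.matrix_rank(cols))  # 4
```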

ii) if w = c_0*v + c_1*Tv + ... + c_k*(T^k)v, then

Tw = c_0*Tv + c_1*(T^2)v + ... + c_k*T^(k+1)v,

so that

Tw = c_0*Tv + c_1*(T^2)v + ... + c_(k-1)*(T^k)v

(since T^(k+1)v = 0),

which is certainly in span{v, Tv, ..., (T^k)v}, hence this span is a T-invariant subspace (if we call it W, T(W) is contained in W).
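The invariance argument can be illustrated numerically too. A sketch with an assumed example where W is a *proper* subspace: T is the 5x5 subdiagonal shift and v = e2 (my choice), so k = 3 and W = span{v, Tv, (T^2)v, (T^3)v} is 4-dimensional inside R^5:

```python
import numpy as np

# Assumed example: 5x5 shift matrix, v = e2, so T^3 v != 0 and T^4 v = 0.
n = 5
T = np.diag(np.ones(n - 1), k=-1)    # 1's on the subdiagonal
v = np.eye(n)[:, 1]                  # v = e2

# Columns of B span W = span{v, Tv, T^2 v, T^3 v}.
B = np.column_stack([np.linalg.matrix_power(T, j) @ v for j in range(4)])

rng = np.random.default_rng(0)
c = rng.standard_normal(4)           # arbitrary coefficients c_0, ..., c_3
w = B @ c                            # an arbitrary w in W

# Tw should again lie in W: solving B x = Tw leaves no residual.
x, *_ = np.linalg.lstsq(B, T @ w, rcond=None)
print(np.allclose(B @ x, T @ w))     # True: Tw lies in the span
```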

Re: Transformation raised to power

I'd like to comment the following: if W = span{v, Tv, ..., (T^k)v}, T|_W is the restriction of T to W, and we consider the basis of W given by {(T^k)v, T^(k-1)v, ..., Tv, v}, then the matrix of T|_W is the (k+1)x(k+1) nilpotent Jordan block: 0's on the diagonal and 1's on the superdiagonal. This is an important result in the theory of Jordan canonical forms.
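The point of the comment, that T restricted to W has the nilpotent Jordan block as its matrix in the basis ((T^k)v, ..., Tv, v), can be verified with a change of basis. A sketch with an assumed concrete example (the 4x4 shift matrix and v = e1, so W is all of R^4):

```python
import numpy as np

# Assumed example: 4x4 subdiagonal shift, v = e1, k = 3.
T = np.diag(np.ones(3), k=-1)
v = np.eye(4)[:, 0]

# Change-of-basis matrix whose columns are T^3 v, T^2 v, Tv, v (in that order).
P = np.column_stack([np.linalg.matrix_power(T, j) @ v for j in (3, 2, 1, 0)])

# The matrix of T|_W in this basis.
J = np.linalg.inv(P) @ T @ P
print(J)

# Expected: the nilpotent Jordan block -- 1's on the superdiagonal, 0's elsewhere.
assert np.allclose(J, np.diag(np.ones(3), k=1))
```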

Re: Transformation raised to power

yes, since T is nilpotent on W, of degree k+1. while it is not always true that, for a given matrix A, we can find a diagonal matrix D such that a change of basis diagonalizes A,

that is, A = PDP^(-1), we CAN find a matrix D and a matrix N, where D is diagonal and N is nilpotent, such that A = P(D + N)P^(-1).

(well, almost. to be sure we can do so, all the eigenvalues of A have to lie in the field F, so it is usually assumed one is working within an algebraically closed field,

such as the complex numbers. that is, the eigenvalues of a REAL matrix may be complex, so to get the Jordan form, we may need to "enlarge the field".)
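The A = P(D + N)P^(-1) decomposition can be seen on a small example. This is a sketch with an assumed 2x2 matrix A that is NOT diagonalizable (eigenvalue 2 with only one independent eigenvector); the basis matrix P was found by hand from an eigenvector and a generalized eigenvector:

```python
import numpy as np

# Assumed example: A has the single eigenvalue 2, but only a
# one-dimensional eigenspace, so it cannot be diagonalized.
A = np.array([[3.0, 1.0],
              [-1.0, 1.0]])

# Columns: eigenvector (1, -1) and generalized eigenvector (1, 0).
P = np.array([[1.0, 1.0],
              [-1.0, 0.0]])

D = np.diag([2.0, 2.0])          # diagonal part
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])       # nilpotent part (N^2 = 0)

# A = P (D + N) P^(-1), with D diagonal and N nilpotent.
print(np.allclose(A, P @ (D + N) @ np.linalg.inv(P)))  # True
print(np.allclose(N @ N, 0))                           # True: N^2 = 0
```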

what happens is, when we look at a factor (x - λ)^k of det(xI - A), it may be that we have fewer than k linearly independent eigenvectors.

but (λI - A) will be nilpotent on the generalized eigenspace, of degree ≤ k, so the block of the Jordan form corresponding to that eigenspace

will be "almost diagonal": it will be a diagonal matrix with some 1's on the super-diagonal.

a vector v that works as in your problem (where (λI - A)^(m+1)v = 0 but (λI - A)^m v ≠ 0)

is called a "generalized eigenvector corresponding to λ" (regular eigenvectors correspond to m = 0).
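A generalized eigenvector is easy to exhibit concretely. A sketch with an assumed 2x2 matrix A with eigenvalue λ = 2 (the matrix and the vector v are my choices for illustration):

```python
import numpy as np

# Assumed example: A has eigenvalue 2; v = (1, 0) is a generalized
# eigenvector with m = 1: (A - 2I)^2 v = 0 but (A - 2I) v != 0.
A = np.array([[3.0, 1.0],
              [-1.0, 1.0]])
M = A - 2.0 * np.eye(2)          # A - lambda*I for lambda = 2

v = np.array([1.0, 0.0])
print(np.allclose(M @ M @ v, 0))   # True:  (A - 2I)^2 v = 0
print(np.allclose(M @ v, 0))       # False: (A - 2I) v != 0
```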