Need [Quick] Help for Proof About Linear Transformations

Hi, I am TA'ing a graduate course on linear algebra this semester, and we have come up to a theorem that I cannot prove myself without resorting to Schur's Lemma and going into issues of density of invertible matrices, which is too advanced at this point in the class. The problem is as follows:

Let $T$ be a linear operator on the finite-dimensional vector space $V$ over the field $F$; let $n = \dim V$. Then, if for all bases of $V$ the matrix representation of $T$ is the same, $T = \lambda I$ for some $\lambda \in F$.

Of course, I should have asked sooner, but the class meets 12 hours from now =P So if anyone can reel off an elementary proof pretty quickly, it would be awesome.
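For concreteness, here is the phenomenon in a quick numpy sketch (the matrices are made up purely for illustration): a non-scalar matrix picks up a different representation under a change of basis, while a scalar one doesn't.

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [0.0, 3.0]])          # a non-scalar operator on R^2
    S = np.array([[1.0, 1.0],
                  [0.0, 1.0]])          # an invertible change of basis

    B = np.linalg.inv(S) @ A @ S        # representation in the new basis
    print(np.allclose(B, A))            # False: the matrix changed
    print(np.allclose(np.linalg.inv(S) @ (2 * np.eye(2)) @ S,
                      2 * np.eye(2)))   # True: 2I is basis-independent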

Re: Need [Quick] Help for Proof About Linear Transformations

I think that if you just permute the basis elements, the matrix actually gets both its rows and its columns permuted (by the same permutation). I'm not entirely sure about this, though.

Next, if the above is true, you can cyclically shift the basis elements by one position, obtaining that all the diagonal entries are the same. Then each off-diagonal, read cyclically, is forced to be constant. The goal would be to show these constants are all the same; I can imagine that using different permutations you can see that all the off-diagonal entries are equal. This is not much so far, but it might be a good start.
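Here is a quick sanity check of the simultaneous row-and-column permutation (just an illustrative numpy sketch with an arbitrary matrix and permutation):

    import numpy as np

    n = 4
    A = np.arange(n * n, dtype=float).reshape(n, n)   # arbitrary matrix

    sigma = np.array([2, 0, 3, 1])      # a permutation of 0..3
    P = np.eye(n)[:, sigma]             # permutation matrix: P e_j = e_sigma(j)

    B = np.linalg.inv(P) @ A @ P        # matrix in the permuted basis
    print(np.allclose(B, A[np.ix_(sigma, sigma)]))    # True: rows AND columns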

Re: Need [Quick] Help for Proof About Linear Transformations

Quick observations and a proof, except for the sole case of $F = \mathbb{F}_2$.

1) $TP = PT$ for every permutation matrix $P$.

That's because, given any basis, you can simply permute its order in how you define $T$ in that basis: the matrix of $T$ in the permuted basis is $P^{-1}TP$, and by hypothesis that equals the matrix of $T$ in the original basis.

2) $TS = ST$ for all $S$ that's an $F$-linear isomorphism of $V$ (equivalently, for all matrices $S \in \mathrm{GL}(n, F)$, if you first nail down an initial preferred basis).

That's because that's how you change bases.
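Spelled out, with $[T]_B$ the matrix of $T$ in the preferred basis $B$ and $S$ the change-of-basis matrix to another basis $B'$:

$$[T]_{B'} = S^{-1}\,[T]_B\,S = [T]_B \iff [T]_B\,S = S\,[T]_B,$$

so the hypothesis says exactly that $[T]_B$ commutes with every invertible $S$.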

3) This problem seems like it might have a nice proof by induction on the dimension of $V$. Finding an $(n-1)$-dimensional subspace $W$ such that $T(W) \subseteq W$ seemed to be the key, and is doable somehow, but I just found this more direct proof, and so stopped searching for such an elegant proof.

4) A proof for $|F| \geq 3$:

Fix an initial basis. Choose any $n$ non-zero elements $s_1, \dots, s_n \in F$.

Let $S = \operatorname{diag}(s_1, \dots, s_n)$. Then obviously $S \in \mathrm{GL}(n, F)$.

Have $(TS)_{ij} = T_{ij}\,s_j$.

Similarly $(ST)_{ij} = s_i\,T_{ij}$.

Since $TS = ST$, have $T_{ij}\,s_j = s_i\,T_{ij}$. Thus $T_{ij}\,(s_j - s_i) = 0$.

But then $T_{ij} = 0$ or $s_i = s_j$, so whenever $s_i \neq s_j$, have $T_{ij} = 0$.
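A quick numeric spot-check of that componentwise identity (an illustrative numpy sketch with arbitrary entries):

    import numpy as np

    n = 3
    s = np.array([1.0, 2.0, 3.0])       # distinct non-zero s_1, ..., s_n
    S = np.diag(s)
    T = np.arange(1, n * n + 1, dtype=float).reshape(n, n)   # arbitrary T

    # (TS - ST)[i, j] equals T[i, j] * (s[j] - s[i])
    print(np.allclose(T @ S - S @ T, T * (s[None, :] - s[:, None])))   # True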

-----

Note that provided $|F| \geq 3$ (as I'll assume from now on), $F$ has at least 2 non-zero elements. That's required for what follows.

Now, choose any pair $i \neq j$. Will show $T_{ij} = 0$ by choosing an appropriate diagonal matrix $S$.

First choose $s_i \neq s_j$, both in $F \setminus \{0\}$ (we know that's possible), and then fill out the other $s_k$'s as needed (all 1's works) to make the diagonal matrix $S$. Using such an $S$, it follows by the argument above that $T_{ij} = 0$.

Thus $T$ is diagonal.

By permuting any basis for $V$, it's clear that the diagonal entries of $T$ must all agree.

Thus, for $|F| \geq 3$, such a $T$ must be of the form $T = \lambda I$ for some $\lambda \in F$.
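As a sanity check, the smallest case this covers ($F = \mathbb{F}_3$, $n = 2$) can also be brute-forced; an illustrative numpy sketch, using the commutation form of the hypothesis from 2):

    import itertools
    import numpy as np

    p, n = 3, 2                          # the field F_3, dimension 2
    mats = [np.array(m, dtype=int).reshape(n, n)
            for m in itertools.product(range(p), repeat=n * n)]

    def det2(M):                         # 2x2 determinant mod p
        return (M[0, 0] * M[1, 1] - M[0, 1] * M[1, 0]) % p

    invertible = [S for S in mats if det2(S) != 0]

    scalars = [T for T in mats
               if all(np.array_equal((T @ S) % p, (S @ T) % p)
                      for S in invertible)]
    print(scalars)                       # exactly 0, I and 2I survive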

-----

5) For $F = \mathbb{F}_2$ and $n = 2$, there are only 4 vectors, and by simple enumeration of possibilities for $T$ and bases you can show that $T$ must be $0$ or the identity. It's a lot to write in LaTeX, so I won't bother (see the enumeration sketch below).

Surely you could do this using elementary matrices, and probably in that generality avoid my constraint.

Since the result popped out so easily for diagonal matrices, I didn't bother.
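That said, the enumeration is quick by machine; an illustrative numpy sketch that checks the basis-by-basis condition directly over $\mathbb{F}_2$:

    import itertools
    import numpy as np

    p, n = 2, 2
    vecs = [np.array(v) for v in itertools.product(range(p), repeat=n)]

    def inv2(B):                         # inverse of a 2x2 matrix mod 2 (det = 1)
        return np.array([[B[1, 1], -B[0, 1]],
                         [-B[1, 0], B[0, 0]]]) % p

    # all ordered bases of F_2^2: pairs of vectors with non-zero determinant
    bases = [np.column_stack(b) for b in itertools.permutations(vecs, 2)
             if (b[0][0] * b[1][1] - b[0][1] * b[1][0]) % p == 1]

    mats = [np.array(m).reshape(n, n)
            for m in itertools.product(range(p), repeat=n * n)]
    same = [T for T in mats
            if all(np.array_equal((inv2(B) @ T @ B) % p, T) for B in bases)]
    print(same)                          # only the zero matrix and the identity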

Re: Need [Quick] Help for Proof About Linear Transformations

just a sketch off the top of my head:

call the matrix representation of T, A. then PA = AP, for ANY invertible matrix P (which we can regard as a "change of basis" matrix).

if det(A) ≠ 0, then A lies in the center of GL(n,F), which consists of the matrices of the form λI for λ ≠ 0 in F (this is not hard to prove).

so the only thing left to prove is that if det(A) = 0, then A = 0. note that if there is some u with Au ≠ 0, and some v ≠ 0 with Av = 0,

extending {v} to a basis of V and extending {u+v} to a basis of V should yield different matrix representations of T: the first column is zero in one and non-zero in the other, since A(u+v) = Au ≠ 0.

since det(A) = 0, there is such a v; hence there can be no u, i.e. A = 0, and that should do it.
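that last step, checked numerically (an illustrative numpy sketch with made-up matrices):

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 1.0]])          # singular but non-zero
    v = np.array([1.0, -1.0])           # Av = 0
    u = np.array([1.0, 0.0])            # Au != 0

    B1 = np.column_stack([v, u])        # extend {v} to a basis
    B2 = np.column_stack([u + v, u])    # extend {u+v} to a basis

    print(np.linalg.inv(B1) @ A @ B1)   # first column is zero
    print(np.linalg.inv(B2) @ A @ B2)   # first column is non-zero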
