Need [Quick] Help for Proof About Linear Transformations

Hi, I am TA'ing a graduate course on linear algebra this semester, and we have come up against a theorem that I cannot prove myself without resorting to Schur's Lemma and issues of density of invertible matrices, which is too advanced at this point in the class. The problem is as follows:

Let $\displaystyle T:V \rightarrow V$ be a linear operator on the finite-dimensional vector space $\displaystyle V$ over $\displaystyle F$, with $\displaystyle \dim(V)=n$. If the matrix representation of $\displaystyle T$ is the same with respect to every basis of $\displaystyle V$, then $\displaystyle T = \lambda I$ for some $\displaystyle \lambda \in F$.

Of course, I should have asked sooner, but the class meets 12 hours from now =P So if anyone can rattle off an elementary proof pretty quickly, it would be awesome.

Re: Need [Quick] Help for Proof About Linear Transformations

I think that if you permute the basis elements, you actually end up permuting both the rows and the columns of the matrix (i.e., conjugating by a permutation matrix). I'm not entirely sure about this, though.

Next, if the above is true, you can cyclically shift the basis elements by one position, obtaining that all the diagonal entries are equal. Then certain off-diagonal sections are individually equal to constants, and the goal would be to show these constants all coincide. I can imagine that using other permutations you can see that all the off-diagonal entries are the same. This is not much so far, but it might be a good start.
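A quick machine check of the cyclic-shift idea (Python/NumPy; the matrix `A` and the shift are arbitrary illustrative choices, not part of the argument):

```python
import numpy as np

# Arbitrary 4x4 matrix standing in for the matrix of T in some basis.
A = np.arange(16, dtype=float).reshape(4, 4)

# Permutation matrix of the cyclic shift of the basis by one position.
P = np.roll(np.eye(4), 1, axis=1)

# Reordering the basis represents T by P^{-1} A P; for a permutation
# matrix, P^{-1} = P^T.  The diagonal of the new matrix is the old
# diagonal cyclically shifted.
B = P.T @ A @ P
assert np.allclose(np.linalg.inv(P), P.T)
assert np.allclose(np.diag(B), np.roll(np.diag(A), 1))
```

If the theorem's hypothesis held for $A$, we would have $B = A$, so in particular the diagonal of $A$ would equal its own cyclic shift, forcing all diagonal entries to be equal.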

Re: Need [Quick] Help for Proof About Linear Transformations

Quick observations and a proof except for the sole case of $\displaystyle \mathbb{F} = \mathbb{Z}/2\mathbb{Z}$.

1) $\displaystyle \exists \lambda$ such that $\displaystyle T_{ii} = \lambda \ \forall i$

That's because, given any basis, you can simply permute its order, and the matrix of T in the reordered basis must be unchanged.

2) TS = ST for every S that is an F-linear isomorphism of V (equivalently, for all matrices S in GL(V, F), once you first nail down an initial preferred basis).

That's because that's exactly how you change bases: passing to the basis determined by S represents T by $\displaystyle S^{-1}TS$, which by hypothesis equals $\displaystyle T$.

3) This problem seems like it might have a nice proof by induction on the dimension of V. Finding an $\displaystyle (n-1)$-dimensional subspace W with $\displaystyle T(W) \subseteq W$ seemed to be the key, and is doable somehow, but I just found this more direct proof, and so stopped searching for such an elegant one.

4) A proof for $\displaystyle \mathbb{F} \ne \mathbb{Z}/2\mathbb{Z}$:

Fix an initial basis. Choose any n elements $\displaystyle b_1, b_2, \ldots, b_n$ in $\displaystyle \mathbb{F}-\{0\}$.

Let $\displaystyle S = \mathrm{diag}(b_1, b_2, \ldots, b_n)$. Then obviously $\displaystyle S \in GL(V, \mathbb{F})$.

We have $\displaystyle (TS)_{ij} = \sum_{k=1}^n T_{ik}S_{kj} = \sum_{k=1}^n T_{ik}(\delta_k^jS_{jj}) = T_{ij}S_{jj} = b_jT_{ij}$.

Similarly, $\displaystyle (ST)_{ij} = \sum_{k=1}^n S_{ik}T_{kj} = \sum_{k=1}^n (\delta_k^iS_{ii})T_{kj} = S_{ii}T_{ij} = b_iT_{ij}$.

Since $\displaystyle TS = ST$, we have $\displaystyle (TS)_{ij} = (ST)_{ij} \ \forall i, j \in \{1, 2, \ldots, n\}$. Thus $\displaystyle b_jT_{ij} = b_iT_{ij} \ \forall i, j \in \{1, 2, \ldots, n\}$.

But then $\displaystyle (b_j - b_i)T_{ij} = 0 \ \forall i, j \in \{1, 2, \ldots, n\}$, so whenever $\displaystyle b_j \ne b_i$, we have $\displaystyle T_{ij} = 0$.
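The entrywise computation above is easy to sanity-check numerically (Python/NumPy; the matrix T and the $b_i$ are arbitrary illustrative choices):

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
T = rng.standard_normal((n, n))        # arbitrary matrix standing in for [T]
b = np.array([1.0, 2.0, 3.0, 4.0])     # distinct nonzero b_i
S = np.diag(b)

# Entrywise, (TS - ST)[i, j] = (b_j - b_i) * T[i, j], as derived above.
assert np.allclose(T @ S - S @ T, (b[None, :] - b[:, None]) * T)

# So if TS = ST and the b_i are pairwise distinct, every off-diagonal
# entry T[i, j] must vanish: T commutes with this S only if T is diagonal.
```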

-----

Note that provided $\displaystyle \mathbb{F} \ne \mathbb{Z}/2\mathbb{Z}$ (as I'll assume from now on), $\displaystyle \mathbb{F}$ has at least 2 non-zero elements. That's required for what follows.

Now, choose any pair $\displaystyle k, l \in \{1, 2, \ldots, n\}$ such that $\displaystyle k \ne l$. We will show $\displaystyle T_{kl} = 0$ by choosing an appropriate diagonal matrix S.

First choose $\displaystyle b_k \ne b_l$, both in $\displaystyle \mathbb{F}-\{0\}$ (we know that's possible), and then fill out the other $\displaystyle b_i$'s as needed (all 1's works) to make the diagonal matrix $\displaystyle S$. Using such an $\displaystyle S$, it follows by the argument above that $\displaystyle T_{kl} = 0$.

Thus $\displaystyle T_{ij} = 0 \ \forall i \ne j, i, j \in \{1, 2, ..., n\}$.

By permuting any basis for $\displaystyle V$, it's clear that the diagonal entries of $\displaystyle T$ must all agree.

Thus, for $\displaystyle \mathbb{F} \ne \mathbb{Z}/2\mathbb{Z}$, such a $\displaystyle T$ must be of the form $\displaystyle T = \lambda I$ for some $\displaystyle \lambda \in \mathbb{F}$.

-----

5) For $\displaystyle \mathbb{F} = \mathbb{Z}/2\mathbb{Z}$ and $\displaystyle n = \dim V = 2$, there are only 4 vectors, and by simple enumeration of the possibilities for T and the bases you can show that T must be a scalar matrix (the zero matrix or the identity). It's a lot to write in LaTeX, so I won't bother.
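The enumeration is small enough to hand to a machine. A brute-force check (Python/NumPy, working mod 2; purely illustrative, not a written-out proof) confirming that over $\mathbb{Z}/2\mathbb{Z}$ with $n = 2$ only the scalar matrices $0$ and $I$ look the same in every basis:

```python
import itertools
import numpy as np

def mats_gf2(n=2):
    """Yield all n x n matrices over GF(2)."""
    for bits in itertools.product([0, 1], repeat=n * n):
        yield np.array(bits).reshape(n, n)

def inv_gf2_2x2(P):
    """Inverse of an invertible 2x2 matrix over GF(2); note -1 == 1 mod 2."""
    a, b, c, d = P.ravel()
    return np.array([[d, b], [c, a]]) % 2

# A 2x2 matrix over GF(2) is invertible iff its determinant is 1 mod 2.
invertible = [P for P in mats_gf2()
              if (P[0, 0] * P[1, 1] - P[0, 1] * P[1, 0]) % 2 == 1]

# T looks the same in every basis  <=>  P^{-1} T P == T for every invertible P.
same_in_all_bases = [
    T for T in mats_gf2()
    if all(np.array_equal((inv_gf2_2x2(P) @ T @ P) % 2, T) for P in invertible)
]

for T in same_in_all_bases:
    print(T)   # prints exactly the zero matrix and the identity
```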

Surely you could do this using elementary matrices, and probably in that generality avoid my $\displaystyle \mathbb{F} \ne \mathbb{Z}/2\mathbb{Z}$ constraint.

Since the result popped out so easily for diagonal matrices, I didn't bother.

Re: Need [Quick] Help for Proof About Linear Transformations

just a sketch off the top of my head:

call the matrix representation of T, A. then PA = AP, for ANY invertible matrix P (which we can regard as a "change of basis" matrix).

if det(A) ≠ 0, then A lies in the center of GL(n,F), which consists of matrices of the form λI (this is not hard to prove), for λ ≠ 0 in F.

so the only thing left to prove is that if det(A) = 0, then A = 0. note that if there is some u with Au ≠ 0, and some v ≠ 0 with Av = 0,

extending {v} to a basis of V and extending {u+v} to a basis of V should yield different matrix representations of T.

since det(A) = 0, there is such a v, hence there can be no u, and that should do it.
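The two-bases step in this sketch can be illustrated concretely (Python/NumPy; the singular matrix A and the vectors u, v are illustrative choices):

```python
import numpy as np

# A singular but nonzero A: Av = 0 for v = e2, while Au != 0 for u = e1.
A = np.array([[1., 0.],
              [0., 0.]])
u = np.array([1., 0.])
v = np.array([0., 1.])

# Two bases (as columns): one starting with v, one starting with u + v,
# both completed by u.
B1 = np.column_stack([v, u])
B2 = np.column_stack([u + v, u])

# The matrix of A relative to a basis B is B^{-1} A B.
rep1 = np.linalg.inv(B1) @ A @ B1
rep2 = np.linalg.inv(B2) @ A @ B2

# rep1 has a zero first column (since Av = 0), but rep2 does not
# (since A(u+v) = Au != 0), so the two representations differ --
# exactly as the argument predicts.
assert np.allclose(rep1[:, 0], 0)
assert not np.allclose(rep2[:, 0], 0)
```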

Re: Need [Quick] Help for Proof About Linear Transformations

Quote (Originally Posted by **Deveno**): just a sketch off the top of my head:

Thanks; I did find a proof (actually two) last night, pretty much immediately after I posted this, and the one I presented as a solution was pretty much along the same lines as yours. From PA = AP you can get around the issue of getting stuck in the special case of $\displaystyle GL_n(F)$ by proceeding from $\displaystyle [T]_{B_x, B_x} = [T]_{B_y, B_y}$ for all bases $\displaystyle B_x, B_y$:

$\displaystyle \sum_{i, j} \lambda_{i,j} P_{B_i, B_j}[T]_{B_x, B_x} = \sum_{i, j} [T]_{B_x, B_x}(\lambda_{i,j} P_{B_i, B_j}) \Rightarrow$

$\displaystyle [\sum_{i, j} \lambda_{i,j} P_{B_i, B_j}] \cdot [T]_{B_x, B_x} = [T]_{B_x, B_x} \cdot [\sum_{i, j} (\lambda_{i,j} P_{B_i, B_j})]$

where the $\displaystyle \lambda_{i,j}$ are arbitrary scalars. Just note that these P's range over all invertible matrices, which span $\displaystyle F^{n \times n}$, and proceed in the obvious way.
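The spanning claim can be checked concretely: every elementary matrix $E_{ij}$ is a difference of two invertible matrices, and the $E_{ij}$ form a basis of $F^{n \times n}$. A small NumPy illustration over the reals (not part of the original post):

```python
import numpy as np

n = 3
I = np.eye(n)
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0
        # E_ij = (I + E_ij) - I, and both I + E_ij and I are invertible:
        # det(I + E_ij) is 1 for i != j (triangular) and 2 for i == j.
        P, Q = I + E, I
        assert np.linalg.det(P) != 0 and np.linalg.det(Q) != 0
        assert np.allclose(E, P - Q)

# Every E_ij lies in the span of invertible matrices, so the invertible
# matrices span the whole matrix space.
```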

I also found a proof by induction using cofactor expansions, but I think that one is more abstract, so I went with this more computational proof.