
Math Help - Need [Quick] Help for Proof About Linear Transformations

  1. #1
    Newbie
    Joined
    Nov 2011
    Posts
    24

    Need [Quick] Help for Proof About Linear Transformations

    Hi, I am TA'ing a graduate course on linear algebra this semester, and we have come up to a theorem that I cannot prove myself without resorting to Schur's Lemma and going into issues of density of invertible matrices, which is too advanced at this point in the class. The problem is as follows:

    Let T:V \rightarrow V be a linear operator on the finite-dimensional vector space V over F, with dim(V)=n. If the matrix representation of T is the same with respect to every basis of V, then T = \lambda I for some \lambda \in F.
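A quick numerical sanity check of the statement (my own illustration, not part of any proof): a scalar matrix \lambda I satisfies P^{-1}(\lambda I)P = \lambda I for every change-of-basis matrix P, while a non-scalar matrix changes representation under some basis change. Pure-Python 2x2 arithmetic over the integers; P is chosen unipotent so its inverse is exact.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[1, 1], [0, 1]]        # a change-of-basis matrix
Pinv = [[1, -1], [0, 1]]    # exact inverse of P

lamI = [[3, 0], [0, 3]]     # the scalar matrix 3*I
T = [[1, 0], [0, 2]]        # diagonal but NOT scalar

conj = lambda A: matmul(Pinv, matmul(A, P))  # representation in the new basis
print(conj(lamI) == lamI)   # True: scalar matrices look the same in every basis
print(conj(T) == T)         # False: a non-scalar matrix changes representation
```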


    Of course, I should have asked sooner, but the class meets 12 hours from now =P So if anyone can wheel off an elementary proof pretty quickly, it would be awesome.

  2. #2
    Senior Member
    Joined
    Jul 2010
    From
    Vancouver
    Posts
    432
    Thanks
    16

    Re: Need [Quick] Help for Proof About Linear Transformations

    I think that if you just permute the basis elements, you'll actually end up permuting both the rows and the columns of the matrix. I'm not entirely sure about this, though.

    Next, if the above is true, you can cyclically shift the basis elements by one position, obtaining that all the diagonal entries are the same. Then certain off-diagonal sections are individually equal to some constants, and the goal would be to show these constants are all the same. I can imagine that by using different permutations you can show that all the off-diagonal entries are equal. This isn't much so far, but it might be a good start.
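The permutation idea can be checked numerically (an illustration, not a proof): reordering the basis by a permutation matrix P conjugates A to P^{-1}AP, which permutes rows and columns simultaneously, so the diagonal entries get permuted among themselves. "Same matrix in every basis" then forces all diagonal entries to be equal.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]

# P swaps the first two basis vectors; this transposition is its own inverse.
P = [[0, 1, 0],
     [1, 0, 0],
     [0, 0, 1]]

B = matmul(P, matmul(A, P))   # P^{-1} A P, using P^{-1} = P here
print(B)                       # rows 0,1 AND columns 0,1 are both swapped
print([B[i][i] for i in range(3)])  # diagonal (1,5,9) became (5,1,9)
```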

  3. #3
    Super Member
    Joined
    Sep 2012
    From
    Washington DC USA
    Posts
    525
    Thanks
    146

    Re: Need [Quick] Help for Proof About Linear Transformations

    Quick observations, and a proof for every case except \mathbb{F} = \mathbb{Z}/2\mathbb{Z}.
    1) \exists \lambda \in \mathbb{F} such that T_{ii} = \lambda \ \forall i.
    That's because, given any basis, you can simply permute its order when writing the matrix of T in that basis.
    2) TS = ST for every S that's an F-linear isomorphism of V (equivalently, for all matrices S \in GL(V, \mathbb{F}), once you nail down an initial preferred basis).
    That's because that's exactly how a change of basis acts on the matrix of T.
    3) This problem looks like it might have a nice proof by induction on \dim V. Finding an (n-1)-dimensional subspace W with T(W) \subseteq W seemed to be the key, and is probably doable, but I found the more direct proof below and so stopped searching for the elegant one.
    4) A proof for \mathbb{F} \ne \mathbb{Z}/2\mathbb{Z}:

    Fix an initial basis. Choose any n elements b_1, b_2, ... b_n in \mathbb{F}-\{0\}.

    Let S = diag(b_1, b_2, ... b_n). Then S \in GL(V, \mathbb{F}), since every diagonal entry is non-zero.

    Have (TS)_{ij} = \sum_{k=1}^n T_{ik}S_{kj} = \sum_{k=1}^n T_{ik}(\delta_{kj}S_{jj}) = T_{ij}S_{jj} = b_jT_{ij}.

    Similarly (ST)_{ij} = \sum_{k=1}^n S_{ik}T_{kj} = \sum_{k=1}^n (\delta_{ki}S_{ii})T_{kj} = S_{ii}T_{ij} = b_iT_{ij}.

    Since TS = ST, have (TS)_{ij} = (ST)_{ij} \ \forall i, j \in \{1, 2, ..., n\}. Thus b_jT_{ij} = b_iT_{ij} \ \forall i, j \in \{1, 2, ..., n\}.

    But then (b_j - b_i)T_{ij} = 0 \ \forall i, j \in \{1, 2, ..., n\}, so whenever b_j \ne b_i, have T_{ij} = 0.

    -----

    Note that provided \mathbb{F} \ne \mathbb{Z}/2\mathbb{Z} (as I'll assume from now on), \mathbb{F} has at least 2 non-zero elements. That's required for what follows.

    Now, choose any pair k, l \in \{1, 2, ..., n\} such that k \ne l. We'll show T_{kl} = 0 by choosing an appropriate diagonal matrix S.

    First choose b_k \ne b_l, both in \mathbb{F}-\{0\} (we know that's possible), and then fill out the other b_i's as needed (all 1's works) to make that diagonal matrix S. Using such an S, it follows by the argument above that T_{kl} = 0.

    Thus T_{ij} = 0 \ \forall i \ne j, i, j \in \{1, 2, ..., n\}.

    By permuting any basis for V, it's clear that the diagonal entries of T must all agree.

    Thus, for \mathbb{F} \ne \mathbb{Z}/2\mathbb{Z}, such a T must be of the form T = \lambda I for some \lambda \in \mathbb{F}.
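A numerical companion to the argument above (my illustration only): for S = diag(b_1, ..., b_n), the identity (ST - TS)_{ij} = (b_i - b_j)T_{ij} holds entrywise, so if T commutes with such an S then choosing b_i \ne b_j kills the (i,j) entry. Here we just verify the identity on an arbitrary integer matrix.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

b = [2, 3, 5]                                   # distinct non-zero diagonal entries
S = [[b[i] if i == j else 0 for j in range(3)] for i in range(3)]
T = [[1, 4, 1],
     [5, 9, 2],
     [6, 5, 3]]

ST = matmul(S, T)
TS = matmul(T, S)
# Verify (ST - TS)_{ij} = (b_i - b_j) * T_{ij} for every entry.
ok = all(ST[i][j] - TS[i][j] == (b[i] - b[j]) * T[i][j]
         for i in range(3) for j in range(3))
print(ok)  # True: so ST = TS would force (b_i - b_j) T_ij = 0 for all i, j
```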

    -----

    5) For \mathbb{F} = \mathbb{Z}/2\mathbb{Z} and n = \dim V = 2, there are only 4 vectors, and by simple enumeration of the possibilities for T and for bases you can show that T must be the identity. It's a lot to write in LaTeX, so I won't bother.

    Surely you could do this using elementary matrices, and in that generality probably avoid my \mathbb{F} \ne \mathbb{Z}/2\mathbb{Z} constraint.

    Since the result popped out so easily for diagonal matrices, I didn't bother.
    Last edited by johnsomeone; September 18th 2012 at 01:46 AM.

  4. #4
    MHF Contributor

    Joined
    Mar 2011
    From
    Tejas
    Posts
    3,317
    Thanks
    697

    Re: Need [Quick] Help for Proof About Linear Transformations

    just a sketch off the top of my head:

    call the matrix representation of T, A. then PA = AP, for ANY invertible matrix P (which we can regard as a "change of basis" matrix).

    if det(A) ≠ 0, then A lies in the center of GL(n,F), which consists of matrices of the form λI (this is not hard to prove), for λ ≠ 0 in F.

    so the only thing left to prove is that if det(A) = 0, then A = 0. note that if there is some u with Au ≠ 0, and some v ≠ 0 with Av = 0, extending {v} to a basis of V and extending {u+v} to a basis of V should yield different matrix representations of A.

    since det(A) = 0, there is such a v, hence there can be no u, and that should do it.
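A concrete instance of this sketch (my own example, over the rationals): take the singular, non-zero A = [[1,0],[0,0]], with v = (0,1) in its kernel and u = (1,0) with Au \ne 0. Extending {v} and {u+v} to bases gives visibly different representations P^{-1}AP.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 0], [0, 0]]         # singular but non-zero

# Basis 1: {v, e1} with v = (0,1); the columns of P1 are the basis vectors.
P1    = [[0, 1], [1, 0]]
P1inv = [[0, 1], [1, 0]]     # a swap is its own inverse

# Basis 2: {u + v, e2} with u + v = (1,1).
P2    = [[1, 0], [1, 1]]
P2inv = [[1, 0], [-1, 1]]    # exact inverse of the unipotent P2

rep1 = matmul(P1inv, matmul(A, P1))
rep2 = matmul(P2inv, matmul(A, P2))
print(rep1)               # [[0, 0], [0, 1]]
print(rep2)               # [[1, 0], [-1, 0]]
print(rep1 == rep2)       # False: the two bases give different representations
```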

  5. #5
    Newbie
    Joined
    Nov 2011
    Posts
    24

    Re: Need [Quick] Help for Proof About Linear Transformations

    Quote Originally Posted by Deveno View Post
    just a sketch off the top of my head:
    Thanks; I did find a proof (actually two) last night, pretty much immediately after I posted this, and the one I presented as a solution ran along much the same lines as yours. Starting from PA = AP, you can get around the issue of being stuck with P ranging only over GL_n(F) by proceeding from [T]_{B_x, B_x} = [T]_{B_y, B_y} for all x, y:

    \sum_{i, j} \lambda_{i,j} P_{B_i, B_j}[T]_{B_x, B_x} = \sum_{i, j} [T]_{B_x, B_x}(\lambda_{i,j} P_{B_i, B_j}) \Rightarrow

    [\sum_{i, j} \lambda_{i,j} P_{B_i, B_j}] \cdot [T]_{B_x, B_x} = [T]_{B_x, B_x} \cdot [\sum_{i, j} (\lambda_{i,j} P_{B_i, B_j})]

    where the \lambda_{i,j} are arbitrary scalars and the P_{B_i, B_j} are change-of-basis matrices. These P's span F^{n \times n}, since every invertible matrix is a change-of-basis matrix and every matrix is a sum of invertible matrices; so [T]_{B_x, B_x} commutes with everything, and you proceed in the obvious way.
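The spanning step can be checked in a small case (my own illustration, over Q): every elementary matrix E_ij is a difference of two invertible matrices, e.g. E_ij = (I + E_ij) - I for i \ne j, so the invertible matrices span the full matrix space. Here we verify this for E_01 in the 2x2 case, checking invertibility via the 2x2 determinant.

```python
def det2(A):
    """Determinant of a 2x2 matrix."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

I = [[1, 0], [0, 1]]
E01 = [[0, 1], [0, 0]]                                          # elementary matrix
M = [[I[i][j] + E01[i][j] for j in range(2)] for i in range(2)]  # I + E01

diff = [[M[i][j] - I[i][j] for j in range(2)] for i in range(2)]
print(diff == E01)       # True: E01 = (I + E01) - I
print(det2(M), det2(I))  # 1 1 -> both summands are invertible
```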

    I also found an induction on cofactor expansions, but I think that is more abstract, so I went with this more computational proof.
