
Proof about linear transformations

  1. #1
    sebasvargasl (Newbie)

    Proof about linear transformations

    Hello everyone. I've been stuck with this for a while...
    Let V be a linear space of finite dimension and T:V->V a linear transformation. Show that T = rI, where r is some real number and I is the identity transformation, if and only if ToS = SoT for every linear transformation S:V->V.

    I could really use some help with the non-trivial implication.

  2. #2
    johng (Super Member)

    Re: Proof about linear transformations

    Hi,
    Can you prove that the only n by n matrices that commute with every n by n matrix are the scalar matrices? (A matrix is scalar iff all off-diagonal entries are 0 and all entries on the main diagonal are equal.) Do you see that this answers your problem?

    If you need help proving the matrix result, post again to this thread.
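
    For intuition, here is a quick numerical illustration of the claim (a minimal sketch, assuming Python with numpy; the specific matrices are arbitrary examples):

    Code:
    import numpy as np

    rng = np.random.default_rng(0)
    n = 4

    scalar = 3.0 * np.eye(n)                    # a scalar matrix: r*I
    non_scalar = np.diag([1.0, 2.0, 3.0, 4.0])  # diagonal but NOT scalar

    for _ in range(5):
        S = rng.standard_normal((n, n))
        # r*I commutes with every S:
        assert np.allclose(scalar @ S, S @ scalar)
        # a non-scalar matrix generically fails to commute:
        print(np.allclose(non_scalar @ S, S @ non_scalar))  # False (almost surely)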

  3. #3
    sebasvargasl (Newbie)

    Re: Proof about linear transformations

    I do see it, but I can't seem to prove it.

  4. #4
    Hartlw (Super Member)

    Re: Proof about linear transformations

    Quote Originally Posted by johng
    Hi,
    Can you prove that the only n by n matrices that commute with every n by n matrix are the scalar matrices?
    If you need help proving the matrix result, post again to this thread.
    I can't. I need help. Sufficiency is easy: if T = rI, then ST = TS for all S. I don't see necessity.

    I tried the following approach to the OP: if TS = ST for all S, let S be non-singular. Then S^{-1}TS = T. Stuck. T = rI obviously works, but I couldn't show that it's the unique solution, besides 0 (necessity).

  5. #5
    sebasvargasl (Newbie)

    Re: Proof about linear transformations

    I just saw this approach on a different forum: they take v != 0 in R^n and extend it to a basis B = {v, b2, b3, ..., bn}, then they define a matrix A such that Av = v and Ab_k = 0 for k = 2, ..., n. My only question is: how can we know this matrix exists for any basis? If it exists, then TAv = ATv yields Tv = A(Tv); since the range of A is span{v}, this means Tv = rv, so every v in R^n is an eigenvector of T. Finally they proceed to prove that all the eigenvectors have the same eigenvalue.
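
    On the existence question: a linear map may be defined freely on a basis, so A always exists; concretely, with P = [v | b2 | ... | bn], the matrix A = P diag(1,0,...,0) P^{-1} does the job. A minimal numpy sketch (assuming Python with numpy; the basis below is an arbitrary example):

    Code:
    import numpy as np

    # A linear map is determined by its values on a basis: with
    # P = [v | b2 | ... | bn], the matrix A = P diag(1,0,...,0) P^{-1}
    # satisfies Av = v and Ab_k = 0.
    v  = np.array([1.0, 2.0, 3.0])   # any nonzero vector
    b2 = np.array([0.0, 1.0, 0.0])   # extended to a basis (example choice)
    b3 = np.array([0.0, 0.0, 1.0])

    P = np.column_stack([v, b2, b3])
    assert abs(np.linalg.det(P)) > 1e-12        # B really is a basis

    A = P @ np.diag([1.0, 0.0, 0.0]) @ np.linalg.inv(P)

    print(np.allclose(A @ v, v))     # True: Av = v
    print(np.allclose(A @ b2, 0))    # True: A b2 = 0
    print(np.allclose(A @ b3, 0))    # True: A b3 = 0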

  6. #6
    Hartlw (Super Member)

    Commuting Matrices

    Given: ST = TS for any S. Prove T = rI.

    1) Let S run successively over the diagonal matrices with a single 1 on the diagonal.
    Then ST = TS for each such S forces the off-diagonal entries of T to be 0.

    2) Let S have 1s in the first row and 0s in all others, and let T be the diagonal matrix from step 1.
    Then ST = TS gives t11 = t22 = t33 = ... = tnn.

    i.e., T = rI if ST = TS for any S. (The only matrix that commutes with every S is rI.)

    How did I get it? The ugliest, least elegant way possible: by screwing around.

    Try it with a 3x3 T. It's interesting to see how it falls out. Successively calculate ST = TS for
    S = (1,0,0; 0,0,0; 0,0,0)
    S = (0,0,0; 0,1,0; 0,0,0)
    S = (0,0,0; 0,0,0; 0,0,1)

    Then calculate ST = TS for T diagonal and S = (1,1,1; 0,0,0; 0,0,0).

    It's fast and easy.
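
    If you'd rather let a machine do the screwing around, here is a minimal sympy sketch of exactly this 3x3 experiment (assuming Python with sympy; the S matrices are the ones listed above):

    Code:
    import sympy as sp

    # Unknown 3x3 matrix T.
    syms = sp.symbols('t0:9')
    T = sp.Matrix(3, 3, syms)

    constraints = []

    # Step 1: S runs over the diagonal matrices with a single 1 on the diagonal.
    for i in range(3):
        S = sp.zeros(3, 3)
        S[i, i] = 1
        constraints += list(S*T - T*S)   # every entry of ST - TS must vanish

    # Step 2: S with 1s in the first row and 0s elsewhere.
    S = sp.zeros(3, 3)
    S[0, :] = sp.ones(1, 3)
    constraints += list(S*T - T*S)

    sol = sp.solve(constraints, syms, dict=True)[0]
    print(T.subs(sol))   # a scalar multiple of the identity (one free diagonal symbol remains)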

  7. #7
    Deveno (MHF Contributor)

    Re: Proof about linear transformations

    Let's try this in a different way: we'll use induction on dim(V). We will suppose we have chosen an appropriate basis for V and identified Hom_F(V,V) with Mat_{nxn}(F) using this basis, for each n, so we can talk about matrices instead of abstract linear transformations.

    The 1-dimensional case (base case) is easy: EVERY 1x1 matrix is a scalar times the 1x1 identity matrix, matrix multiplication is just the field multiplication, and all 1x1 matrices commute with each other.

    Now suppose, as our induction hypothesis, that an nxn matrix T satisfies ST = TS for all nxn matrices S if and only if T = rI_n.

    Now it should be clear that if T = rI_{n+1}, then ST = TS for every (n+1)x(n+1) matrix S, so we only need to show for our inductive step that if T DOES commute with every matrix S, then T = rI_{n+1}.

    To do this, we will write (using block matrix multiplication):

    S = \begin{bmatrix}S_1&S_2\\S_3&S_4 \end{bmatrix};\ T = \begin{bmatrix}T_1&T_2\\T_3&T_4 \end{bmatrix}

    where S1,T1 are nxn matrices, S2,T2 are nx1 matrices, S3,T3 are 1xn matrices, and S4,T4 are just the (n+1),(n+1)-entries of S and T, respectively.

    Now, we have:

    ST = \begin{bmatrix}S_1T_1 + S_2T_3&S_1T_2+S_2T_4\\S_3T_1+S_4T_3&S_3T_2+S_4T_4 \end{bmatrix}

    while:

    TS = \begin{bmatrix}T_1S_1 + T_2S_3&T_1S_2+T_2S_4\\T_3S_1+T_4S_3&T_3S_2+T_4S_4 \end{bmatrix}

    Our goal is to first prove that if T_2 \neq 0 or T_3 \neq 0, we can find SOME matrix that doesn't commute with T.

    Let's look at T2 first. If T2 is not all 0's, let t_{k(n+1)} be the first non-zero entry (these are all in the last column of T, so the "k" is really all we're interested in).

    For our "S", we'll use E(n+1)k, which has all 0's except for the n+1,k-entry which is 1, in other words:

    S1 = 0, S2 = 0, S4 = 0, and S3 = (0,0,...,1,...,0) (a row vector with 1 in the k-th place). This gives us:

    ST = \begin{bmatrix}0&0\\S_3T_1&S_3T_2 \end{bmatrix}

    TS = \begin{bmatrix}T_2S_3&0\\T_4S_3&0 \end{bmatrix}

    We need to show these two matrices are not equal. Note that in ST, the 1x1 matrix in the lower right is S3T2, which is just the dot product of e_k and T2; it returns the k-th coordinate of T2, which is t_{k(n+1)} ≠ 0.

    However, in TS, this block is 0, so these two matrices CANNOT be equal (we don't even need to look at the other parts).

    So if T is to commute with EVERY S (including the particular matrix E_{(n+1)k}), we MUST have that the T2 block is all zero.

    Hopefully, you can guess what is coming next: if T3 is not all 0's, let t_{(n+1)k} be the first non-zero entry. This time, we'll use the "S" matrix S = E_{k(n+1)}, so that:

    S1 = 0, S3 = 0, S4 = 0, and S2 = e_k (as a column vector). Here, we have:

    ST = \begin{bmatrix}S_2T_3&S_2T_4\\0&0 \end{bmatrix}

    TS = \begin{bmatrix}0&T_1S_2\\0&T_3S_2 \end{bmatrix}.

    Again, we only have to look at the lower right corner: in ST, this block is 0, while in TS it is T3e_k, which again returns the k-th entry of T3, namely t_{(n+1)k} ≠ 0.

    So, here, again, we find that if T is to commute with EVERY S, then T3 = 0.

    This means that T is of the form:

    T = \begin{bmatrix}T_1&0\\0&T_4 \end{bmatrix}, and we have simpler forms for ST and TS:

    ST = \begin{bmatrix} S_1T_1&S_2T_4\\S_3T_1&S_4T_4 \end{bmatrix}

    TS = \begin{bmatrix} T_1S_1&T_1S_2\\T_4S_3&T_4S_4 \end{bmatrix}

    Now, comparing these 2 matrices (in particular, the nxn block in the upper left), we see that for them to be equal, we must have: S1T1 = T1S1.

    For this to be true of ANY nxn matrix S1, we must have T1 = rI_n, by our induction hypothesis. So we can simplify even further:

    T4 is just a single number; let's call it r'. Writing ST and TS with what we now know, we have:

    ST = \begin{bmatrix}rS_1&r'S_2\\rS_3&r'S_4 \end{bmatrix}

    TS = \begin{bmatrix}rS_1&rS_2\\r'S_3&r'S_4 \end{bmatrix}

    Comparing the upper right blocks together, and the lower left blocks together, we see for these two matrices to be equal we need r = r', thus:

    T = \begin{bmatrix}rI_n&0\\0&r \end{bmatrix} = rI_{n+1}, quod erat demonstrandum.
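
    For concreteness, here is a tiny sympy check of the first step above, with n+1 = 4 (a sketch, assuming Python with sympy; the index k = 1 is an arbitrary example):

    Code:
    import sympy as sp

    # n + 1 = 4: a generic T, and S = E_{(n+1)k} with k = 1.
    T = sp.Matrix(4, 4, sp.symbols('t0:16'))
    k = 1
    S = sp.zeros(4, 4)
    S[3, k] = 1

    ST, TS = S*T, T*S
    print(ST[3, 3])   # t7: this is S3*T2 = t_{k(n+1)}, the entry T[1, 3]
    print(TS[3, 3])   # 0:  the lower-right block of TS is 0

    # So whenever T[1, 3] != 0, this S and T cannot commute, as claimed.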

  8. #8
    sebasvargasl (Newbie)

    Re: Proof about linear transformations

    Masterly proof, Deveno. Thank you so much.

  9. #9
    Hartlw (Super Member)

    Re: Proof about linear transformations

    Deveno has a good start. As a matter of interest, let's pick up from his expressions for ST and TS above ("reply with quote" didn't work) and continue on a different track:

    Let S2 = S3 = 0.
    Then:
    ST = (S1T1, S1T2; S4T3, S4T4)
    TS = (T1S1, T2S4; T3S1, T4S4)

    S1 = 0, S4 ≠ 0 -> T3 = 0
    S4 = 0, S1 ≠ 0 -> T2 = 0

    So, by the induction hypothesis applied to T1, T is diagonal with n r's and one r'.
    To force r = r', use S with first row all 1s and everything else 0.

    Therefore T = rI for (n+1)x(n+1)

  10. #10
    Hartlw (Super Member)

    Re: Proof about linear transformations

    Actually, still using Deveno's notation, it's easier to start with:
    S = (S1, 0; 0, 1), T = (T1, T2; T3, T4)
    Then from ST = TS you get:
    S1T2 = T2, and S1 = 0 -> T2 = 0
    T3S1 = T3, and S1 = 0 -> T3 = 0

    So, by the induction hypothesis, T is diagonal with n r's and one r'.
    To force r = r', use S with first row all 1s and everything else 0.

    Therefore T = rI for (n+1)x(n+1)

  11. #11
    Deveno (MHF Contributor)

    Re: Proof about linear transformations

    Quote Originally Posted by Hartlw
    Actually, still using Deveno's notation, it's easier to start with:
    S = (S1, 0; 0, 1), T = (T1, T2; T3, T4)
    Then from ST = TS you get:
    S1T2 = T2, and S1 = 0 -> T2 = 0
    T3S1 = T3, and S1 = 0 -> T3 = 0

    So, by the induction hypothesis, T is diagonal with n r's and one r'.
    To force r = r', use S with first row all 1s and everything else 0.

    Therefore T = rI for (n+1)x(n+1)
    It took me a while to get what you were saying, but, yes: if we choose our "S" to be:

    \begin{bmatrix}0&0\\0&1 \end{bmatrix} then:

    ST = TS \iff \begin{bmatrix}0&0\\T_3&T_4 \end{bmatrix} = \begin{bmatrix}0&T_2\\0&T_4 \end{bmatrix}

    which gives T_2,T_3 = 0 with much less work. So that makes for a substantial improvement.

    I think the matrix:

    S = \begin{bmatrix}I_n&0\\e_1^T&0 \end{bmatrix}

    works just as well to show r = r', but starting from:

    "T4 is just a single number..."

    it's clear that the off-diagonal (block) elements only differ by the scalar in front, so the two scalars r,r' must be the same:

    If (for example) rS2 = r'S2, then rS2 - r'S2 = 0, thus (r - r')S2 = 0.

    Since we can choose S2 freely, we can choose it to be non-zero, which then forces r - r' = 0 (for any vector space V with v in V (and kxm matrices do form a vector space for any k,m), if av = 0 with v non-zero, then a = 0).

    That said, there is nothing wrong with picking the S you prefer; it clearly works quite well. A quick concrete check of the claim above is sketched after this paragraph.
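
    Here is that check with n = 3, i.e. 4x4 matrices (a sketch, assuming Python with sympy; rp stands in for r'):

    Code:
    import sympy as sp

    r, rp = sp.symbols('r rp')   # rp stands for r'

    # S = [[I_3, 0], [e_1^T, 0]] written out as a 4x4 matrix,
    # and T = diag(r, r, r, r') as left after the induction step.
    S = sp.Matrix([[1, 0, 0, 0],
                   [0, 1, 0, 0],
                   [0, 0, 1, 0],
                   [1, 0, 0, 0]])
    T = sp.diag(r, r, r, rp)

    # Only the (4,1) entry of ST - TS survives, and it equals r - rp,
    # so ST = TS forces r = r'.
    print(S*T - T*S)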

    *******

    (Project Crazy Project has a nice proof as well, but it's a little "subscript-heavy" if you know what I mean)

  12. #12
    Hartlw (Super Member)

    Re: Proof about linear transformations

    Deveno, you write (in quotes):

    "it's clear that the off-diagonal (block) elements only differ by the scalar in front, so the two scalars r, r' must be the same." Not clear to me at all.

    "If (for example) rS2 = r'S2, then rS2 - r'S2 = 0, thus (r - r')S2 = 0." Where does this come from?

    Instead of my previous proof of r = r', you might prefer:
    S = (I, 1; 0, 0), T = (rI, 0; 0, r'), which from ST = TS gives:
    r = r'

    EDIT:
    In very, very abbreviated notation (S1 = nxn and the others have to satisfy the requirements of the partitioning), you could also use:
    S = (S1, 1; 0, 0), T = (r, 0; 0, r'), ST = TS -> r = r'

  13. #13
    Hartlw (Super Member)

    Re: Proof about linear transformations

    Quote Originally Posted by Deveno
    It took me a while to get what you were saying, but, yes: if we choose our "S" to be:

    \begin{bmatrix}0&0\\0&1 \end{bmatrix} then:

    ST = TS \iff \begin{bmatrix}0&0\\T_3&T_4 \end{bmatrix} = \begin{bmatrix}0&T_2\\0&T_4 \end{bmatrix}

    which gives T_2,T_3 = 0 with much less work. So that makes for a substantial improvement.

    I think the matrix:

    S = \begin{bmatrix}I_n&0\\e_1^T&0 \end{bmatrix}

    works just as well to show r = r', but starting from:
    That's it. Perfect. End of proof, end of story. What's the point of the rest?

    Because I didn't understand e_1^T, which is (1,0,...,0), 1xn, I simply duplicated your r = r' proof without knowing it. Sorry.

  14. #14
    Hartlw (Super Member)

    Re: Proof about linear transformations

    Just as a matter of consistency of style with the rest of the proof:
    S=(S1,0;0,1), T=(T1,0;0,T4) gives T4=r

