1. ## Proof about linear transformations

Hello everyone. I've been stuck with this for a while...
Let V be a finite-dimensional linear space and T:V->V a linear transformation. Show that T=rI, where r is some real number and I is the identity transformation, if and only if T∘S=S∘T for every linear transformation S:V->V.

I could really use some help with the non-trivial implication

2. ## Re: Proof about linear transformations

Hi,
Can you prove that the only n by n matrices that commute with every n by n matrix are the scalar matrices? ( A matrix is scalar iff all non-diagonal elements are 0 and all elements on the main diagonal are the same.) Do you see that this answers your problem?

If you need help proving the matrix result, post again to this thread.
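
If it helps to convince yourself before proving it, here is a quick numerical sanity check of the matrix claim (a Python/NumPy sketch, purely illustrative; the dimension, seed, and test matrices are arbitrary choices, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A scalar matrix rI commutes with every S.
T = 3.0 * np.eye(n)
for _ in range(100):
    S = rng.standard_normal((n, n))
    assert np.allclose(S @ T, T @ S)

# A non-scalar matrix fails to commute with some S: here a diagonal
# matrix with distinct diagonal entries against S = E_12.
T = np.diag([1.0, 2.0, 3.0, 4.0])
S = np.zeros((n, n))
S[0, 1] = 1.0
assert not np.allclose(S @ T, T @ S)
```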

3. ## Re: Proof about linear transformations

I do see it. But I can't seem to prove it.

4. ## Re: Proof about linear transformations

Originally Posted by johng
Hi,
Can you prove that the only n by n matrices that commute with every n by n matrix are the scalar matrices?
If you need help proving the matrix result, post again to this thread.
I can't. I need help. Sufficiency is easy: if T=rI, then ST=TS for all S. I don't see necessity.

I tried the following approach to the OP: if TS=ST for all S, let S be non-singular. Then S⁻¹TS=T. Stuck. T=rI obviously works, but I couldn't show that it's the unique solution, besides 0. (Necessity.)

5. ## Re: Proof about linear transformations

I just saw this approach on a different forum: they take v != 0 in R^n and extend it to a basis B={v,b2,b3,...,bn}, then they define a matrix A such that Av=v and Abk=0 for k=2,...,n. My only question is: how can we know this matrix exists for any basis? If it exists, then TAv=ATv yields Tv=A(Tv), so Tv is fixed by A; since the only vectors fixed by A are the multiples of v, Tv=rv, meaning every v in R^n is an eigenvector of T. Finally they proceed to prove that all the eigenvectors share the same eigenvalue.
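
To answer my own existence question: a linear map may be assigned arbitrary values on a basis, so A exists for any basis containing v. Concretely, in the basis B the map is diag(1,0,...,0), and conjugating back gives A in the standard basis. A small NumPy sketch of this construction (illustrative only; the dimension and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
v = rng.standard_normal(n)

# Extend v to a basis B = {v, b2, ..., bn}.  Random vectors are almost
# surely independent; in general one would complete deterministically
# (e.g. Gram-Schmidt or adding standard basis vectors).
B = np.column_stack([v] + [rng.standard_normal(n) for _ in range(n - 1)])
assert np.linalg.matrix_rank(B) == n

# A linear map is determined by freely chosen values on a basis:
# send v -> v and each b_k -> 0.  In the basis B that map is
# diag(1, 0, ..., 0); conjugating back gives A in the standard basis.
D = np.zeros((n, n))
D[0, 0] = 1.0
A = B @ D @ np.linalg.inv(B)

assert np.allclose(A @ v, v)            # Av = v
for k in range(1, n):
    assert np.allclose(A @ B[:, k], 0)  # A b_k = 0
```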

6. ## Commuting Matrices

Given: ST=TS for any S. Prove T=rI.

1) Let the S's successively be diagonal matrices with a single 1 on the diagonal.
Then ST=TS for each such S gives that the off-diagonal terms of T must be 0.

2) Let S have 1's in the first row and 0's in all others, and let T be the diagonal matrix from 1).
Then ST=TS gives t11=t22=t33=.....=tnn.

i.e., T=rI if ST=TS for any S. (The only matrix that commutes with every S is rI.)

How did I get it? The ugliest, least elegant way possible: by screwing around.

Try it with a 3x3 T. It's interesting to see how it falls out. Successively calculate ST=TS for
S=(1,0,0;0,0,0;0,0,0)
S=(0,0,0;0,1,0;0,0,0)
S=(0,0,0;0,0,0;0,0,1)

Then calculate ST=TS for T diagonal and S=(1,1,1;0,0,0;0,0,0)

It's fast and easy.
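
For anyone who wants to watch it fall out without doing the hand computation, here is the same experiment in NumPy (just an illustration of steps 1 and 2 above, not part of the argument; the random matrix and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((3, 3))   # a generic 3x3 T

# Step 1: commuting with E_ii (a single 1 on the diagonal) forces the
# off-diagonal entries of row i and column i of T to be 0: the
# commutator E_ii T - T E_ii contains exactly those entries (with signs).
S = np.zeros((3, 3))
S[0, 0] = 1.0                           # E_11
C = S @ T - T @ S
assert np.isclose(C[0, 1], T[0, 1])     # row-1 off-diagonal entry appears
assert np.isclose(C[1, 0], -T[1, 0])    # column-1 entry appears negated

# Step 2: with T diagonal, S = 1's in the first row and 0's elsewhere
# forces the diagonal entries to be equal.
S = np.zeros((3, 3))
S[0, :] = 1.0
E = np.diag([5.0, 5.0, 5.0])
assert np.allclose(S @ E, E @ S)        # equal diagonal entries: commutes
D = np.diag([1.0, 2.0, 3.0])
assert not np.allclose(S @ D, D @ S)    # distinct diagonal entries: fails
```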

7. ## Re: Proof about linear transformations

Let's try this in a different way: we'll use induction on dim(V). We will suppose we have chosen an appropriate basis for V, and identified HomF(V,V) with Matnxn(F) using this basis, for each n, so we can talk about matrices, instead of abstract linear transformations.

The 1-dimensional case (base case) is easy: EVERY 1x1 matrix is a scalar times 1, the 1x1 identity matrix, and matrix multiplication is just the field multiplication, and all 1x1 matrices commute with each other.

Now suppose, as our induction hypothesis, that ST = TS for all nxn matrices S if and only if T = rIn.

Now it should be clear that if T = rIn+1, then ST = TS for every (n+1)x(n+1) matrix S, so for our inductive step we only need to show that if T DOES commute with every matrix S, then T = rIn+1.

To do this, we will write (using block matrix multiplication):

$\displaystyle S = \begin{bmatrix}S_1&S_2\\S_3&S_4 \end{bmatrix};\ T = \begin{bmatrix}T_1&T_2\\T_3&T_4 \end{bmatrix}$

where S1,T1 are nxn matrices, S2,T2 are nx1 matrices, S3,T3 are 1xn matrices, and S4,T4 are just the (n+1),(n+1)-entries of S and T, respectively.

Now, we have:

$\displaystyle ST = \begin{bmatrix}S_1T_1 + S_2T_3&S_1T_2+S_2T_4\\S_3T_1+S_4T_3&S_3T_2+S_4T_4 \end{bmatrix}$

while:

$\displaystyle TS = \begin{bmatrix}T_1S_1 + T_2S_3&T_1S_2+T_2S_4\\T_3S_1+T_4S_3&T_3S_2+T_4S_4 \end{bmatrix}$

Our first goal is to prove that if $\displaystyle T_2 \neq 0$ or $\displaystyle T_3 \neq 0$, we can find SOME matrix that doesn't commute with T.

Let's look at T2, first. If T2 is not all 0's, let tk(n+1) be the first non-zero entry (these are all in the last column of T, so the "k" is really all we're interested in).

For our "S", we'll use E(n+1)k, which has all 0's except for the n+1,k-entry which is 1, in other words:

S1 = 0, S2 = 0, S4 = 0, and S3 = (0,0,...1,...0) (a row-vector with 1 in the k-th place). This gives us:

$\displaystyle ST = \begin{bmatrix}0&0\\S_3T_1&S_3T_2 \end{bmatrix}$

$\displaystyle TS = \begin{bmatrix}T_2S_3&0\\T_4S_3&0 \end{bmatrix}$

We need to show these 2 matrices are not equal. Note that in ST, the 1x1 matrix in the lower right is S3T2, which is just the dot product of ek and T2, which returns the k-th coordinate of T2, which is tk(n+1) ≠ 0.

However, in TS, this block is 0, so these two matrices CANNOT be equal (we don't even need to look at the other parts).

So if T is to commute with EVERY S (including the particular matrix E(n+1)k), we MUST have that the T2 block is all zero.

Hopefully, you can guess what is coming next: if T3 is not all 0's, let t(n+1)k be the first non-zero entry. This time, we'll use the "S" matrix S = Ek(n+1), so that:

S1 = 0, S3 = 0, S4 = 0, and S2 = ek (as a column vector). Here, we have:

$\displaystyle ST = \begin{bmatrix}S_2T_3&S_2T_4\\0&0 \end{bmatrix}$

$\displaystyle TS = \begin{bmatrix}0&T_1S_2\\0&T_3S_2 \end{bmatrix}$.

Again, we only have to look at the lower right corner: in ST, this block is 0, while in TS it is T3ek, which again returns the k-th entry of T3, t(n+1)k ≠ 0.

So, here, again, we find that if T is to commute with EVERY S, then T3 = 0.

This means that T is of the form:

$\displaystyle T = \begin{bmatrix}T_1&0\\0&T_4 \end{bmatrix}$, and we have simpler forms for ST and TS:

$\displaystyle ST = \begin{bmatrix} S_1T_1&S_2T_4\\S_3T_1&S_4T_4 \end{bmatrix}$

$\displaystyle TS = \begin{bmatrix} T_1S_1&T_1S_2\\T_4S_3&T_4S_4 \end{bmatrix}$

Now, comparing these 2 matrices (in particular at the nxn block in the upper left), we see that for them to be equal, we must have: S1T1 = T1S1.

For this to be true of ANY nxn matrix S1, we must have T1 = rIn, by our induction hypothesis. So we can simplify even further:

T4 is just a single number, let's call it r'. Writing ST and TS with what we now know, we have:

$\displaystyle ST = \begin{bmatrix}rS_1&r'S_2\\rS_3&r'S_4 \end{bmatrix}$

$\displaystyle TS = \begin{bmatrix}rS_1&rS_2\\r'S_3&r'S_4 \end{bmatrix}$

Comparing the upper right blocks together, and the lower left blocks together, we see for these two matrices to be equal we need r = r', thus:

$\displaystyle T = \begin{bmatrix}rI_n&0\\0&r \end{bmatrix} = rI_{n+1}$, quod erat demonstrandum.
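
A quick numerical check of the key step above (a NumPy sketch, not part of the proof; n = 3, r = 2, and the planted entry 7 in the T2 block are arbitrary values):

```python
import numpy as np

n = 3   # so we work with (n+1) x (n+1) = 4x4 matrices
r = 2.0

T = np.zeros((n + 1, n + 1))
T[:n, :n] = r * np.eye(n)   # T1 = rIn
T[n, n] = r                 # T4
k = 0
T[k, n] = 7.0               # plant one non-zero entry in the T2 block

# S = E_{(n+1)k}: all zeros except a 1 in the (n+1),k position.
S = np.zeros((n + 1, n + 1))
S[n, k] = 1.0

# The lower-right entry of ST is S3.T2 = t_{k,n+1} != 0, while the
# same entry of TS is 0 -- so this S and T cannot commute.
assert (S @ T)[n, n] == T[k, n] != 0
assert (T @ S)[n, n] == 0
```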

8. ## Re: Proof about linear transformations

Masterly proof Deveno. Thank you so much.

9. ## Re: Proof about linear transformations

Deveno has a good start. As a matter of interest, let's pick up from his block expressions for ST and TS above and continue on a different track:

Let S2 = S3 = 0.
Then:
ST = (S1T1,S1T2 ; S4T3,S4T4)
TS = (T1S1,T2S4 ; T3S1,T4S4)

S1=0, S4≠0 -> T3=0
S4=0, S1≠0 -> T2=0

The upper-left blocks then give S1T1 = T1S1 for every S1, so T1 = rIn by the induction hypothesis: T is diagonal with n r's and one r'.
To force r'=r, use S with first row all 1's and everything else 0.

Therefore T=rI for (n+1) x (n+1).

10. ## Re: Proof about linear transformations

S=(S1,0;0,1) T= (T1,T2;T3,T4)
Then from ST=TS you get:
S1T2=T2, & S1=0 -> T2=0
T3S1=T3, & S1=0 -> T3=0

The upper-left blocks give S1T1 = T1S1 for every S1, so T1 = rIn by the induction hypothesis: T is diagonal with n r's and one r'.
To force r'=r, use S with first row all 1's and everything else 0.

Therefore T=rI for (n+1) x (n+1).

11. ## Re: Proof about linear transformations

Originally Posted by Hartlw
S=(S1,0;0,1) T= (T1,T2;T3,T4)
Then from ST=TS you get:
S1T2=T2, & S1=0 -> T2=0
T3S1=T3, & S1=0 -> T3=0

So T is diagonal with n r’s and one r’.
To force r’=r, Use S with first row all 1’s and everything else 0.

Therefore T=rI for n+1X n+1
It took me a while to get what you were saying, but, yes: if we choose our "S" to be:

$\displaystyle \begin{bmatrix}0&0\\0&1 \end{bmatrix}$ then:

$\displaystyle ST = TS \iff \begin{bmatrix}0&0\\T_3&T_4 \end{bmatrix} = \begin{bmatrix}0&T_2\\0&T_4 \end{bmatrix}$

which gives $\displaystyle T_2,T_3 = 0$ with much less work. So that makes for a substantial improvement.
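
Here is that single-matrix trick checked numerically (an illustrative NumPy sketch with an arbitrary random 4x4 T): ST keeps only the last row of T and TS keeps only the last column, so equality wipes out the T2 and T3 blocks.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
T = rng.standard_normal((n + 1, n + 1))

# S = [[0,0],[0,1]] in block form, i.e. the single matrix E_{n+1,n+1}.
S = np.zeros((n + 1, n + 1))
S[n, n] = 1.0

# ST keeps only the last row of T; TS keeps only the last column.
# So ST = TS forces the T2 and T3 blocks of T to vanish.
ST, TS = S @ T, T @ S
assert np.allclose(ST[n, :], T[n, :])   # last row of T survives in ST
assert np.allclose(TS[:, n], T[:, n])   # last column of T survives in TS
assert np.allclose(ST[:n, :], 0)        # everything else is wiped out
assert np.allclose(TS[:, :n], 0)
```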

I think the matrix:

$\displaystyle S = \begin{bmatrix}I_n&0\\e_1^T&0 \end{bmatrix}$

works just as well to show r = r', but starting from:

"T4 is just a single number..."

it's clear that the off-diagonal (block) elements only differ by the scalar in front, so the two scalars r,r' must be the same:

If (for example) r'S2 = rS2, then rS2 - r'S2 = 0, thus (r - r')S2 = 0.

Since we can choose S2 freely, we can choose it to be non-zero, which then forces r - r' = 0 (for any vector space V, with v in V (and kxm matrices do form a vector space for any k,m), if av = 0 with v non-zero, then a = 0).

That said, there is nothing wrong with picking the S you prefer, it clearly works quite well.
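
For completeness, the r = r' step with the matrix S = [In, 0; e1^T, 0] above, checked numerically (a NumPy sketch with deliberately different scalars, chosen arbitrarily, to see exactly where equality fails):

```python
import numpy as np

n = 3
r, rp = 2.0, 5.0              # deliberately different scalars r and r'

T = np.diag([r] * n + [rp])   # T = [[rIn, 0], [0, r']]

# S = [[In, 0], [e1^T, 0]]
S = np.zeros((n + 1, n + 1))
S[:n, :n] = np.eye(n)
S[n, 0] = 1.0                 # e1^T in the last row

# The lower-left block of ST is r*e1^T, but that of TS is r'*e1^T,
# so ST = TS would force r = r'.
assert (S @ T)[n, 0] == r
assert (T @ S)[n, 0] == rp
assert not np.allclose(S @ T, T @ S)
```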

*******

(Project Crazy Project has a nice proof as well, but it's a little "subscript-heavy" if you know what I mean)

12. ## Re: Proof about linear transformations

Deveno, you write (in quotes):

“it's clear that the off-diagonal (block) elements only differ by the scalar in front, so the two scalars r,r' must be the same:” Not clear to me at all.

“If (for example) rS1 = r'S1, then rS1 - r'S1 = 0, thus (r - r')S1 = 0.” Where does this come from?

Instead of my previous proof of r=r’, you might prefer:
S=(I,1;0,0), T=(rI,0;0,r'), which from ST=TS gives:
r=r’

EDIT:
In very, very abbreviated notation (S1=nxn and the others have to satisfy requirements of partitioning), you could also use:
S=(S1,1;0,0), T=(r,0;0,r’), ST=TS -> r=r'

13. ## Re: Proof about linear transformations

Originally Posted by Deveno
It took me a while to get what you were saying, but, yes: if we choose our "S" to be:

$\displaystyle \begin{bmatrix}0&0\\0&1 \end{bmatrix}$ then:

$\displaystyle ST = TS \iff \begin{bmatrix}0&0\\T_3&T_4 \end{bmatrix} = \begin{bmatrix}0&T_2\\0&T_4 \end{bmatrix}$

which gives $\displaystyle T_2,T_3 = 0$ with much less work. So that makes for a substantial improvement.

I think the matrix:

$\displaystyle S = \begin{bmatrix}I_n&0\\e_1^T&0 \end{bmatrix}$

works just as well to show r = r', but starting from:
That's it. Perfect. End of proof, end of story. What's the point of the rest?

Because I didn't understand e1T, which is (1,0,0...), 1xn, I simply duplicated your r=r' proof without knowing it. Sorry.

14. ## Re: Proof about linear transformations

Just as a matter of consistency of style with the rest of the proof:
S=(S1,e1;0,1), T=(rI,0;0,T4): comparing the upper-right blocks of ST=TS gives T4e1=re1, i.e. T4=r