# Linear Algebra: Linear Transformations(1)

• Dec 24th 2008, 10:25 PM
Isomorphism
Linear Algebra: Linear Transformations(1)
Hello,

I am teaching myself linear algebra following P. R. Halmos's "Finite-Dimensional Vector Spaces".

I am stuck at a couple of problems:
-----------------------------------------------------------------------------------------------------------------------------

Problem (1)

Prove that corresponding to every linear transformation A on a finite-dimensional vector space V, there exists an invertible linear transformation P such that APA = A. (Or equivalently, prove that PA is a projection.)

-----------------------------------------------------------------------------------------------------------------------------

Problem (2)

If A, B, C are linear transformations on a finite-dimensional vector space, does (AB - BA)^2 always commute with C?

What happens when the dimension of the vector space is 2?

-----------------------------------------------------------------------------------------------------------------------------

Problem (3)

From the basic definition of the determinant of a linear transformation (see below), prove that $\displaystyle \text{det}(A \otimes B) = \text{det}(A)^m \, \text{det}(B)^n$, where A and B are linear transformations on vector spaces of dimensions n and m respectively.

(I know the proof of the particular case where A and B are matrices, using eigenvalues. But I would love to see a proof using the definition below.)
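I don't have the forms-based proof either, but the identity itself is easy to sanity-check numerically with Kronecker products (a numpy sketch; this only verifies the statement, it is not the requested proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2
A = rng.standard_normal((n, n))   # acts on an n-dimensional space
B = rng.standard_normal((m, m))   # acts on an m-dimensional space

# det(A tensor B) = det(A)^m * det(B)^n
lhs = np.linalg.det(np.kron(A, B))
rhs = np.linalg.det(A) ** m * np.linalg.det(B) ** n
print(np.isclose(lhs, rhs))  # True
```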

-----------------------------------------------------------------------------------------------------------------------------

Definition of Determinant of a linear transformation:

If W is the space of all alternating n-linear forms on an n-dimensional vector space V, then we know that W is one-dimensional. Thus, given a linear transformation A, there is a scalar $\displaystyle \delta$ such that $\displaystyle w(Ax_1, Ax_2, \ldots, Ax_n) = \delta \, w(x_1, x_2, \ldots, x_n)$ for every w in W and all vectors $\displaystyle x_1, x_2, \ldots, x_n$ in V. Then det(A) is defined to be the scalar $\displaystyle \delta$.

Thank you,
Srikanth
• Dec 25th 2008, 12:05 AM
NonCommAlg
Quote:

Originally Posted by Isomorphism
Hello,

I am teaching myself Linear Algebra following P.R.Halmos' "Finite Dimensional Vector Spaces".

I am stuck at a couple of problems:

Problem (1)

Prove that corresponding to every linear transformation A on a finite-dimensional vector space V, there exists an invertible linear transformation P such that APA = A. (Or equivalently, prove that PA is a projection.)

the proof of this is in the book and it's quite easy! see Theorem 3, page 94.
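The construction can also be illustrated numerically via the singular value decomposition (my choice of tool here, not the book's proof): if $\displaystyle A = U\Sigma V^T$ has r nonzero singular values, take $\displaystyle P = V\Sigma' U^T$ where $\displaystyle \Sigma'$ inverts the nonzero singular values and puts 1s in the remaining diagonal slots; then P is invertible and $\displaystyle \Sigma \Sigma' \Sigma = \Sigma$ gives APA = A. A numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
# a singular 4x4 matrix of rank 2
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 4))

U, s, Vt = np.linalg.svd(A)
tol = 1e-10
# invert the nonzero singular values; replace the (numerically) zero ones
# by 1 so that P stays invertible
s_inv = np.array([1.0 / x if x > tol else 1.0 for x in s])
P = Vt.T @ np.diag(s_inv) @ U.T

print(np.allclose(A @ P @ A, A))      # True: APA = A
print(abs(np.linalg.det(P)) > tol)    # True: P is invertible
```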

Quote:

Problem (2)

If A, B, C are linear transformations on a finite-dimensional vector space, does (AB - BA)^2 always commute with C?
no. it's easier to work with matrices than with abstract transformations here, and they're the same anyway: take $\displaystyle A=e_{12}, \ B=e_{21}, \ C=e_{13}.$ (here $\displaystyle e_{ij}$ has 1 as its (i,j)-entry and 0 everywhere else.)
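a quick numeric check of this counterexample (a numpy sketch, building the $\displaystyle e_{ij}$ directly):

```python
import numpy as np

def e(i, j, n=3):
    """n x n matrix with a 1 in the (i, j) entry (1-based) and 0 elsewhere."""
    m = np.zeros((n, n))
    m[i - 1, j - 1] = 1.0
    return m

A, B, C = e(1, 2), e(2, 1), e(1, 3)
X = A @ B - B @ A        # = e11 - e22 = diag(1, -1, 0)
X2 = X @ X               # = diag(1, 1, 0)

# X2 @ C keeps the first row of C, while C @ X2 kills its third column
print(np.allclose(X2 @ C, C @ X2))  # False
```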

Quote:

What happens when the dimension of the vector space is 2?
in this case, the claim is true: let $\displaystyle AB-BA=X.$ clearly $\displaystyle \text{tr}(X)=0.$ thus by Cayley-Hamilton, $\displaystyle X^2 - \text{tr}(X)X + \det(X)I = 0,$ i.e. $\displaystyle X^2 = -\det(X)I = \alpha I,$ for some scalar $\displaystyle \alpha.$ now it's obvious that $\displaystyle X^2C=CX^2,$ for all $\displaystyle C.$
• Dec 25th 2008, 12:47 AM
Isomorphism
Wow! This proof is fantastic (Cool).

As of now, AB and BA are just linear transformations in the book, and Halmos has not yet defined the trace or proved the Cayley-Hamilton theorem. So there is probably an elementary proof of this result. But I do know both results for matrices.

Nevertheless, it's a wonderful proof. Thank you NonCommAlg, you (Rock)

If you have any idea on how to do the third one, can you tell me?

Thanks again,
Srikanth
• Dec 25th 2008, 01:21 AM
NonCommAlg
here's a direct way for Problem 2, part 2: let $\displaystyle \{e_1,e_2 \}$ be a basis for our vector space. let $\displaystyle A(e_1)=ae_1 + be_2, \ A(e_2)=ce_1 + de_2, \ \ B(e_1)=a'e_1+b'e_2, \ B(e_2)=c'e_1 + d'e_2.$ a short computation gives:

$\displaystyle (AB-BA)(e_1)=te_1 + re_2, \ \ (AB-BA)(e_2)=se_1 - te_2,$ where $\displaystyle t = b'c - bc'$ and $\displaystyle r,s$ are some scalars. (the diagonal coefficients $\displaystyle t$ and $\displaystyle -t$ need not be zero, but they cancel in the square.) thus: $\displaystyle (AB-BA)^2(e_1)=t(te_1+re_2)+r(se_1-te_2)=(t^2+rs)e_1,$ and similarly $\displaystyle (AB-BA)^2(e_2)=(t^2+rs)e_2.$ hence: $\displaystyle (AB-BA)^2=(t^2+rs)I$ and the result follows.
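a randomized sanity check of the 2×2 identity (numpy sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
A, B, C = (rng.standard_normal((2, 2)) for _ in range(3))

X = A @ B - B @ A   # traceless, since tr(AB) = tr(BA)
X2 = X @ X          # should be a scalar multiple of the identity

print(np.allclose(X2, X2[0, 0] * np.eye(2)))  # True: X^2 = (t^2 + rs) I
print(np.allclose(X2 @ C, C @ X2))            # True: hence X^2 commutes with every C
```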