# Thread: Linear & Matrix Transformations

1. ## Linear & Matrix Transformations

I am beginning to learn about linear transformations and matrix-vector products, both vast subjects.

I have three questions that I have never found addressed, and I hope somebody out there can tell me or point me to the answers.

First, do there exist linear transformations in Rn that cannot be represented as matrix-vector products, where all entries are real numbers? So far as I know, there are none.

Second, do there exist matrix-vector products in Rn, where all entries are real numbers, that do not represent linear transformations? So far as I know, there are none.

Third, do there exist linear transformations in Rn, represented by matrix-vector products, that are neither isometries, dilations, nor shears? So far as I know, there are none.

I feel like I must be missing something, but I don't know what.

Thanks!

2. ## Re: Linear & Matrix Transformations

Originally Posted by zhandele
I am beginning to learn about linear transformations and matrix-vector products, both vast subjects.

I have three questions that I have never found addressed, and I hope somebody out there can tell me or point me to the answers.

First, do there exist linear transformations in Rn that cannot be represented as matrix-vector products, where all entries are real numbers? So far as I know, there are none.
No. Every linear transformation from Rn to Rn can be expressed as an n by n matrix. The exact matrix depends on your choice of basis for Rn; two matrices represent the same linear transformation in different bases if and only if they are "similar matrices".
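To make that concrete, here is a small sketch in Python: the matrix of a linear map T on Rn has T(ej) as its j-th column, where ej is the j-th standard basis vector. The map `rotate` is just a sample I made up for illustration.

```python
# Sketch: read off the matrix of a linear map column by column.
# 'rotate' is a sample map (rotation of R^2 by 90 degrees), not
# anything special -- any linear map would do.
def rotate(v):
    """A sample linear map on R^2."""
    x, y = v
    return (-y, x)

def matrix_of(T, n):
    """Return the n x n matrix of T in the standard basis:
    column j is T(e_j)."""
    cols = [T(tuple(1 if i == j else 0 for i in range(n))) for j in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(n)]

def matvec(A, v):
    """Ordinary matrix-vector product."""
    return tuple(sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A)))

A = matrix_of(rotate, 2)
print(A)                    # [[0, -1], [1, 0]]
print(matvec(A, (3, 4)))    # (-4, 3)
print(rotate((3, 4)))       # (-4, 3) -- the matrix product reproduces the map
```

Applying T to the basis vectors is all it takes; that is why the matrix always exists.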

Second, do there exist matrix-vector products in Rn, where all entries are real numbers, that do not represent linear transformations? So far as I know, there are none.
Again, no. In fact, multiplication by a matrix is itself a linear transformation. Since any n-dimensional real vector space is isomorphic to Rn, any given n by m matrix represents a linear transformation from an m-dimensional vector space to an n-dimensional vector space.
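A quick numerical check of the two linearity axioms for x -> Ax; the matrix A, the vectors u, v, and the scalar c below are arbitrary sample values.

```python
# Verify additivity and homogeneity for a sample matrix-vector product.
def matvec(A, x):
    return tuple(sum(row[j] * x[j] for j in range(len(x))) for row in A)

A = [[1, 2, 0],
     [0, -1, 3]]          # a sample 2 x 3 matrix: a map from R^3 to R^2
u, v, c = (1, 0, 2), (4, -1, 1), 5

add = lambda p, q: tuple(a + b for a, b in zip(p, q))
scale = lambda s, p: tuple(s * a for a in p)

# additivity: A(u + v) = Au + Av
assert matvec(A, add(u, v)) == add(matvec(A, u), matvec(A, v))
# homogeneity: A(cu) = c(Au)
assert matvec(A, scale(c, u)) == scale(c, matvec(A, u))
print("both linearity axioms hold for these samples")
```

Of course one sample proves nothing by itself; the axioms follow for every A from the distributivity of multiplication over addition, which is exactly what the sums in `matvec` use.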

Third, do there exist linear transformations in Rn, represented by matrix-vector products, that are neither isometries, dilations, nor shears? So far as I know, there are none.
Taken one at a time, yes, such transformations exist: in R2, the map (x, y) -> (2x, 3y) is not an isometry (it changes lengths), not a dilation (it scales the two axes by different factors), and not a shear (its determinant is 6, while a shear has determinant 1). What is true is that every invertible linear transformation is a composition of those three kinds. A singular transformation, such as a projection, is not even that, since any composition of invertible maps is invertible.
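Here is a small numerical check of a map that is none of the three individually. I am taking "dilation" to mean uniform scaling and "shear" to mean a determinant-1 map fixing a line pointwise; both are the usual conventions, stated here as assumptions.

```python
# T(x, y) = (2x, 3y): neither an isometry, a dilation, nor a shear.
import math

def T(v):
    x, y = v
    return (2 * x, 3 * y)

e1, e2 = (1, 0), (0, 1)
norm = lambda v: math.hypot(*v)

# Not an isometry: it changes the length of e1 (1 -> 2).
assert norm(T(e1)) != norm(e1)
# Not a dilation: the two axes are stretched by different factors (2 vs 3).
assert norm(T(e1)) / norm(e1) != norm(T(e2)) / norm(e2)
# Not a shear: its determinant is 2*3 = 6, while every shear has determinant 1.
assert 2 * 3 != 1
print("T is neither an isometry, a dilation, nor a shear")
```

T is invertible, though, so it can still be built out of the three kinds in combination.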

I feel like I must be missing something, but I don't know what.

Thanks!

3. ## Re: Linear & Matrix Transformations

it turns out that for a given pair of vector spaces, say U with dim(U) = n and V with dim(V) = m (for this to make sense, both must be vector spaces over the same field, F), the set of all linear transformations from U to V (often denoted Hom(U,V)) is itself a vector space.

we define, for S,T in Hom(U,V), S+T by:

(S+T)(u) = S(u) + T(u) <--note the RHS addition is in V

and:

(aS)(u) = a(S(u)) <----this is the scalar multiplication of V.

this makes Hom(U,V) into a vector space. now for Matmxn(F), we obviously can add matrices:

if A = (aij) and B = (bij), then we set A+B = (aij+bij).

and we also have scalar multiples: cA = (caij).
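as a quick sanity check (a sketch with sample 2x2 matrices; nothing canonical about the values), the pointwise operations on maps match the entrywise operations on their matrices:

```python
# pointwise sum of linear maps = entrywise sum of their matrices
def matvec(A, v):
    return tuple(sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A)))

A = [[1, 2], [3, 4]]      # sample matrices for two maps S and T on R^2
B = [[0, 1], [-1, 0]]
S = lambda v: matvec(A, v)
T = lambda v: matvec(B, v)

# (S+T)(u) = S(u) + T(u), computed pointwise in V...
pointwise = lambda v: tuple(s + t for s, t in zip(S(v), T(v)))
# ...equals multiplication by the entrywise sum A + B.
AplusB = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

v = (5, -2)
assert pointwise(v) == matvec(AplusB, v)
print("pointwise sum of maps = entrywise sum of matrices, on this sample")
```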

and we have the following important theorem: Hom(U,V) is isomorphic to Matmxn(F).

we will show this by actually displaying an isomorphism (note this isomorphism is NOT unique, as it depends on a choice of bases).

let {u1,...,un} be a basis for U, and let {v1,...,vm} be a basis for V.

define the maps Tij in Hom(U,V) by Tij(uj) = vi and Tij(uk) = 0 for k ≠ j.

for example, with m = 3, n = 4, we have:

T21(u) = T21(a1u1+a2u2+a3u3+a4u4) =

a1T21(u1) + a2T21(u2) + a3T21(u3) + a4T21(u4)

= a1v2 + a20 + a30 + a40 = a1v2.
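the maps Tij are easy to sketch in code. here vectors are coordinate tuples in the chosen bases, so uj and vi become standard basis tuples; the indices are 1-based to match the notation above:

```python
# T_ij sends u_j to v_i and every other basis vector of U to 0.
def T(i, j, m):
    """Return T_ij as a function on coordinate tuples, with dim(V) = m.
    Indices i, j are 1-based."""
    def apply(u):
        a_j = u[j - 1]                   # coefficient of u_j in u
        return tuple(a_j if k == i - 1 else 0 for k in range(m))
    return apply

# the example above: m = 3, n = 4, u = a1*u1 + ... + a4*u4
a = (7, 5, -1, 2)                        # sample coefficients a1..a4
T21 = T(2, 1, 3)
print(T21(a))                            # (0, 7, 0) = a1 * v2
```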

lemma #1: the {Tij} form a basis for Hom(U,V).

suppose S is in Hom(U,V).

then S(u) = S(a1u1+...+anun) = b1v1+...+bmvm

in particular, for each j = 1,2,...,n we have:

S(uj) = b1jv1+...+bmjvm

= b1jT1j(uj) +...+ bmjTmj(uj) = (b1jT1j+...+bmjTmj)(uj).

thus S(u) = a1S(u1) +...+ anS(un)

= a1(b11T11(u1)+...+bm1Tm1(u1)) +...+ an(b1nT1n(un)+...+bmnTmn(un))

= b11[T11(a1u1)+...+T11(anun)] +...+ bm1[Tm1(a1u1)+...+Tm1(anun)] +...+ b1n[T1n(a1u1)+...+T1n(anun)] +...+ bmn[Tmn(a1u1)+...+Tmn(anun)]

(remember Tij(uk) = 0 if j ≠ k, so the "extra terms" i've added are all 0's).

= b11T11(u)+...+bm1Tm1(u)+...+b1nT1n(u)+...+bmnTmn(u)

= (b11T11+...+b1nT1n+...+bm1Tm1+...+bmnTmn)(u)

that is, any S in Hom(U,V) is a linear combination of the Tij, so these span Hom(U,V).

next, suppose that c11T11+...+cmnTmn = 0 (the 0-map of Hom(U,V)).

then (c11T11+...+cmnTmn)(uj) = 0 (the 0-vector of V), so

(c1jT1j+...+cmjTmj)(uj) = 0 (these are the only terms which survive, since Tij(uk) = 0, for j ≠ k) so

c1jv1+...+cmjvm = 0, for EACH j, and by the linear independence of v1,...,vm:

c1j = ...= cmj = 0, for all j, so the Tij are linearly independent, and thus a basis for Hom(U,V).
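in coordinates, the lemma says the coefficient of Tij in the expansion of S is exactly the (i,j) entry of the matrix of S. a sketch (B is just a sample matrix, and the indices are 0-based here for convenience):

```python
# rebuild S from the basis maps T_ij, weighted by its matrix entries,
# and check the result agrees with the matrix-vector product.
def matvec(B, u):
    return tuple(sum(B[i][j] * u[j] for j in range(len(u))) for i in range(len(B)))

def T(i, j, m):
    """T_ij on coordinate tuples (0-based indices)."""
    return lambda u: tuple(u[j] if k == i else 0 for k in range(m))

B = [[1, 2, 0],
     [3, -1, 4]]          # sample matrix of S, with m = 2, n = 3
m, n = 2, 3

def S_from_basis(u):
    # sum over i, j of B[i][j] * T_ij(u), accumulated coordinate-wise
    out = [0] * m
    for i in range(m):
        for j in range(n):
            t = T(i, j, m)(u)
            for k in range(m):
                out[k] += B[i][j] * t[k]
    return tuple(out)

u = (1, 2, 3)
assert S_from_basis(u) == matvec(B, u)
print(S_from_basis(u))    # (5, 13)
```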

now let Eij be the mxn matrix with the ij-th entry 1, and everything else 0. the rest of the proof is simply:

ψ:Hom(U,V)→Matmxn(F) given by: ψ(Tij) = Eij is a linear isomorphism.

it suffices to prove that ψ is bijective, since our definition of ψ automatically makes it linear.

but ψ matches the basis {Tij} with the set {Eij}, both of which have mn elements, so ψ is a bijection (two finite-dimensional vector spaces over the same field are isomorphic if and only if their dimensions are equal).

(if you want to prove the Eij form a basis for Matmxn(F), knock yourself out...it's essentially the same proof as for the Tij. that is, we're using the fact that:

as vector spaces, Matmxn(F) ≅ Fmn <--this is called the "vectorization" of a matrix: you make one long column out of the matrix by putting the 2nd column below the 1st, the 3rd below the 2nd, etc.)
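the "one long column" construction is easy to write down (a sketch; column-major order, stacking column 2 under column 1, and so on):

```python
# column-major vectorization: an m x n matrix becomes a vector of length m*n.
def vec(A):
    m, n = len(A), len(A[0])
    return tuple(A[i][j] for j in range(n) for i in range(m))

A = [[1, 2, 3],
     [4, 5, 6]]           # a sample 2 x 3 matrix
print(vec(A))             # (1, 4, 2, 5, 3, 6)
```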