Linear & Matrix Transformations

I am beginning to learn about linear transformations and matrix-vector products, both vast subjects.

I have three questions that I have never found addressed, and I hope somebody out there can tell me or point me to the answers.

First, do there exist linear transformations in R^n that cannot be represented as matrix-vector products, where all entries are real numbers? So far as I know, there are none.

Second, do there exist matrix-vector products in R^n, where all entries are real numbers, that do not represent linear transformations? So far as I know, there are none.

Third, do there exist linear transformations in R^n, represented by matrix-vector products, that are neither isometries, dilations, nor shears? So far as I know, there are none.

I feel like I must be missing something, but I don't know what.

Thanks!

Re: Linear & Matrix Transformations

Quote:

Originally Posted by

**zhandele** I am beginning to learn about linear transformations and matrix-vector products, both vast subjects.

I have three questions that I have never found addressed, and I hope somebody out there can tell me or point me to the answers.

First, do there exist linear transformations in R^n that cannot be represented as matrix-vector products, where all entries are real numbers? So far as I know, there are none.

No. Every linear transformation from R^n to R^n can be expressed as an n by n matrix. The exact matrix depends upon your choice of basis for R^n; two matrices represent the same linear transformation in different bases if and only if they are "similar matrices".
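A quick numerical sketch of that last point (using numpy; the particular rotation and basis are just illustrative choices): the same linear transformation has a different matrix in a different basis, and the two matrices are similar.

```python
import numpy as np

# the matrix of rotation by 90 degrees in R^2, in the standard basis
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# columns of P form a different (arbitrarily chosen) basis for R^2
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# the matrix of the SAME transformation in the new basis is similar to A
B = np.linalg.inv(P) @ A @ P

# changing coordinates back recovers A: A and B represent one map
assert np.allclose(P @ B @ np.linalg.inv(P), A)
```

Similar matrices share basis-independent data such as the determinant, which is one way to see they describe the same underlying map.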

Quote:

Second, do there exist matrix-vector products in R^n, where all entries are real numbers, that do not represent linear transformations? So far as I know, there are none.

Again, no. In fact, every matrix **is** itself a linear transformation: the map x → Ax. Since any n-dimensional vector space is isomorphic to R^n, any given n by m matrix represents a linear transformation from an m-dimensional vector space to an n-dimensional vector space.
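You can check the two linearity conditions numerically for any matrix (this is a spot check, not a proof; the random matrix and vectors are just sample data):

```python
import numpy as np

# verify A(x + c*y) = A x + c (A y) for a sample 3x4 matrix,
# i.e. that x -> A @ x is a linear map from R^4 to R^3
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))        # 3x4 matrix: maps R^4 -> R^3
x = rng.standard_normal(4)
y = rng.standard_normal(4)
c = 2.5

assert np.allclose(A @ (x + c * y), A @ x + c * (A @ y))
```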

Quote:

Third, do there exist linear transformations in R^n, represented by matrix-vector products, that are neither isometries, dilations, nor shears? So far as I know, there are none.

Here some care is needed. Every **invertible** linear transformation is a composition of isometries, dilations, and shears. But a singular linear transformation, such as the projection of R^2 onto the x-axis, is not: isometries, dilations, and shears are all invertible, and so is any composition of them.
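The determinant makes this caveat concrete (a small sketch; the particular rotation, dilation, and shear are arbitrary examples): isometries have determinant ±1, a dilation by c in R^n has determinant c^n, and shears have determinant 1, so any composition has nonzero determinant, while a projection does not.

```python
import numpy as np

rotation = np.array([[0.0, -1.0],
                     [1.0,  0.0]])      # isometry, det = 1
dilation = 3.0 * np.eye(2)              # dilation by 3, det = 9
shear    = np.array([[1.0, 2.0],
                     [0.0, 1.0]])       # horizontal shear, det = 1

composite = rotation @ dilation @ shear
assert abs(np.linalg.det(composite)) > 0     # always invertible

projection = np.array([[1.0, 0.0],
                       [0.0, 0.0]])     # projection onto the x-axis
assert np.linalg.det(projection) == 0   # singular: cannot be such a
                                        # composition
```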

Quote:

I feel like I must be missing something, but I don't know what.

Thanks!

Re: Linear & Matrix Transformations

it turns out that for a given pair of vector spaces, say U, where dim(U) = n, and V, where dim(V) = m (for this to make sense, these must both be vector spaces over the same field, F), the set of all linear transformations from U to V (often denoted Hom(U,V)) is itself a vector space.

we define, for S,T in Hom(U,V), S+T by:

(S+T)(u) = S(u) + T(u) <--note the RHS addition is in V

and:

(aS)(u) = a(S(u)) <----this is the scalar multiplication of V.

this makes Hom(U,V) into a vector space. now for Mat_{mxn}(F), we obviously can add matrices:

if A = (a_{ij}) and B = (b_{ij}), then we set A+B = (a_{ij}+b_{ij}).

and we also have scalar multiples: cA = (ca_{ij}).
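when S and T are given by matrices A and B, the Hom(U,V) operations above line up exactly with entrywise matrix addition and scaling. a small numerical check (sample matrices and vector chosen at random):

```python
import numpy as np

# if S, T correspond to matrices A, B, then S+T corresponds to A+B
# and aS corresponds to aA: check the defining identities pointwise
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4))
u = rng.standard_normal(4)
a = -1.5

assert np.allclose((A + B) @ u, A @ u + B @ u)   # (S+T)(u) = S(u)+T(u)
assert np.allclose((a * A) @ u, a * (A @ u))     # (aS)(u)  = a(S(u))
```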

and we have the following important theorem: Hom(U,V) is isomorphic to Mat_{mxn}(F).

we will show this by actually displaying an isomorphism (note this isomorphism is NOT unique, as it depends on a choice of bases).

let {u_{1},...,u_{n}} be a basis for U, and let {v_{1},...,v_{m}} be a basis for V.

define the map T_{ij} in Hom(U,V) by T_{ij}(u_{j}) = v_{i}, and T_{ij}(u_{k}) = 0 for k ≠ j.

for example, with m = 3, n = 4, we have:

T_{21}(u) = T_{21}(a_{1}u_{1}+a_{2}u_{2}+a_{3}u_{3}+a_{4}u_{4}) =

a_{1}T_{21}(u_{1}) + a_{2}T_{21}(u_{2}) + a_{3}T_{21}(u_{3}) + a_{4}T_{21}(u_{4})

= a_{1}v_{2} + a_{2}0 + a_{3}0 + a_{4}0 = a_{1}v_{2}.
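in coordinates, T_{21} is just the matrix E_{21} (a single 1 in row 2, column 1), and the computation above can be replayed numerically (the coordinates of u here are arbitrary sample values):

```python
import numpy as np

# T_{21} for m = 3, n = 4, as the 3x4 matrix E_{21}:
# a lone 1 in row 2, column 1 (0-indexed position [1, 0])
E21 = np.zeros((3, 4))
E21[1, 0] = 1.0

# u = a_1 u_1 + a_2 u_2 + a_3 u_3 + a_4 u_4, in coordinates
a = np.array([5.0, 6.0, 7.0, 8.0])

result = E21 @ a
# T_{21}(u) = a_1 v_2, i.e. (0, a_1, 0) in the coordinates of V
assert np.allclose(result, [0.0, 5.0, 0.0])
```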

lemma #1: the {T_{ij}} form a basis for Hom(U,V).

suppose S is in Hom(U,V).

then for any u = a_{1}u_{1}+...+a_{n}u_{n}, S(u) is some vector b_{1}v_{1}+...+b_{m}v_{m} in V.

in particular, for each j = 1,2,...,n we have:

S(u_{j}) = b_{1j}v_{1}+...+b_{mj}v_{m}

= b_{1j}T_{1j}(u_{j}) +...+ b_{mj}T_{mj}(u_{j}) = (b_{1j}T_{1j}+...+b_{mj}T_{mj})(u_{j}).

thus S(u) = a_{1}S(u_{1}) +...+ a_{n}S(u_{n})

= a_{1}(b_{11}T_{11}(u_{1})+...+b_{m1}T_{m1}(u_{1})) +...+ a_{n}(b_{1n}T_{1n}(u_{n})+...+b_{mn}T_{mn}(u_{n}))

= b_{11}[T_{11}(a_{1}u_{1})+...+T_{11}(a_{n}u_{n})] +...+ b_{m1}[T_{m1}(a_{1}u_{1})+...+T_{m1}(a_{n}u_{n})] +...+ b_{1n}[T_{1n}(a_{1}u_{1})+...+T_{1n}(a_{n}u_{n})] +...+ b_{mn}[T_{mn}(a_{1}u_{1})+...+T_{mn}(a_{n}u_{n})]

(remember T_{ij}(u_{k}) = 0 if j ≠ k, so the "extra terms" i've added are all 0's).

= b_{11}T_{11}(u)+...+b_{m1}T_{m1}(u)+...+b_{1n}T_{1n}(u)+...+b_{mn}T_{mn}(u)

= (b_{11}T_{11}+...+b_{1n}T_{1n}+...+b_{m1}T_{m1}+...+b_{mn}T_{mn})(u)

that is, any S in Hom(U,V) is a linear combination of the T_{ij}, so these span Hom(U,V).
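the matrix version of this span argument is easy to check directly: any m x n matrix S is the sum over i, j of its (i,j)-entry times E_{ij} (a random sample matrix stands in for S):

```python
import numpy as np

# write an arbitrary 3x4 matrix S as sum_{i,j} S[i,j] * E_{ij},
# where E_{ij} is the matrix with a lone 1 in position (i, j)
m, n = 3, 4
rng = np.random.default_rng(2)
S = rng.standard_normal((m, n))

total = np.zeros((m, n))
for i in range(m):
    for j in range(n):
        E_ij = np.zeros((m, n))
        E_ij[i, j] = 1.0
        total += S[i, j] * E_ij

assert np.allclose(total, S)   # the E_{ij} span Mat_{mxn}(F)
```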

next, suppose that c_{11}T_{11}+...+c_{mn}T_{mn} = 0 (the 0-map of Hom(U,V)).

then (c_{11}T_{11}+...+c_{mn}T_{mn})(u_{j}) = 0 (the 0-vector of V), so

(c_{1j}T_{1j}+...+c_{mj}T_{mj})(u_{j}) = 0 (these are the only terms which survive, since T_{ij}(u_{k}) = 0, for j ≠ k) so

c_{1j}v_{1}+...+c_{mj}v_{m} = 0, for EACH j, and by the linear independence of the v_{i}:

c_{1j} = ...= c_{mj} = 0, for all j, so the T_{ij} are linearly independent, and thus a basis for Hom(U,V).

now let E_{ij} be the mxn matrix with the ij-th entry 1, and everything else 0. the rest of the proof is simply:

ψ:Hom(U,V)→Mat_{mxn}(F) given by: ψ(T_{ij}) = E_{ij} is a linear isomorphism.

it suffices to prove that ψ is bijective, since our definition of ψ automatically makes it linear.

but ψ maps the basis {T_{ij}} bijectively onto the basis {E_{ij}}, both of cardinality mn, so ψ is a bijection (two finite-dimensional vector spaces are isomorphic if and only if their dimensions are equal).

(if you want to prove the E_{ij} form a basis for Mat_{mxn}(F), knock yourself out...it's essentially the same proof as for the T_{ij}...that is, we're using the fact that:

*as vector spaces*, Mat_{mxn}(F) ≅ F^{mn} <--this is called "the vectorization of matrices": you just make one long column out of a matrix, by putting the 2nd column below the 1st, the 3rd below the 2nd, etc)
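numpy happens to implement exactly this column-stacking via its Fortran (column-major) ordering, so the isomorphism can be seen in one line (the sample 3x2 matrix is arbitrary):

```python
import numpy as np

# vectorization: stack the columns of a 3x2 matrix into one vector
# of length 6, realizing Mat_{3x2}(R) ~ R^6 as vector spaces
A = np.array([[1, 4],
              [2, 5],
              [3, 6]])

vec = A.reshape(-1, order='F')   # order='F' = column-major stacking
assert np.array_equal(vec, [1, 2, 3, 4, 5, 6])
```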