# Dimension of Transformations and Transformations in General

#### Kyo

Hi, I am having trouble even starting this problem, and any help would be appreciated.

1) The question is: If B is an nxn matrix, X = {A in M(mxn) matrix space | AB = 0} and Y = {AB | A in M(mxn) matrix space}, show that X and Y are subspaces of M(mxn) and that dim(X) + dim(Y) = mn.

What I have) From observation, X resembles the null space (kernel) of some linear transformation T, and Y resembles its image; the natural candidate seems to be T(A) = AB. If that identification works, then dim(X) + dim(Y) = mn = dim(M(mxn)) should follow from the rank-nullity theorem, but I am not sure how to make it rigorous.
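As a sanity check on this guess, here is a small numerical sketch (my own hypothetical example, not part of the problem): it builds the matrix of T(A) = AB in the standard basis E_ij of M(mxn), computes its rank exactly, and checks that dim(X) + dim(Y) = mn for a singular B.

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination over the rationals (exact arithmetic)."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(M[0]) if M else 0):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def matrix_of_T(B, m):
    """Matrix of T(A) = A*B acting on M(m x n), in the basis E_ij.

    E_ij * B has a single nonzero row: row i equals row j of B.
    Column (i*n + j) of the result is the row-major flattening of E_ij * B.
    """
    n = len(B)
    cols = []
    for i in range(m):
        for j in range(n):
            img = [[0] * n for _ in range(m)]
            img[i] = list(B[j])          # row i of E_ij * B is row j of B
            cols.append([x for row in img for x in row])
    return [list(col) for col in zip(*cols)]  # transpose: images as columns

# Example: n = 2, m = 3, and B singular with rank(B) = 1.
B = [[1, 2], [2, 4]]
m, n = 3, len(B)
T = matrix_of_T(B, m)
dim_Y = rank(T)           # dim(Y) = dim of the image {A*B}
dim_X = m * n - dim_Y     # dim(X) = dim of the kernel {A : A*B = 0}
print(dim_X, dim_Y, dim_X + dim_Y == m * n)  # -> 3 3 True
```

Running it with other choices of B and m keeps giving dim(X) + dim(Y) = mn, which is consistent with the rank-nullity guess (here dim(Y) also comes out as m times rank(B)).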

2) As a related question, can someone explain to me how to show that a transformation is one to one and/or onto? Suppose T: V-->W. I know that one to one means ker(T) = {0}; does that mean that when T is one to one, no nonzero vector satisfies T(v) = 0? And likewise, does failing to be onto mean that not every vector w in W can be written as T(v) for some v?
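To pin down the definitions I am working from (these are the standard ones, as I understand them):

$$T \text{ is one to one} \iff \ker(T) = \{0\} \iff \big(T(v) = 0 \Rightarrow v = 0\big),$$

$$T \text{ is onto} \iff \operatorname{im}(T) = W \iff \text{every } w \in W \text{ equals } T(v) \text{ for some } v \in V.$$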

As always, thank you for any help. My main focus is the initial problem; the explanation of one to one/onto is secondary. If there are any sources you can point me to, please do - I am just hoping for a different way to approach the thinking, since I am quite confused.

#### chiro

MHF Helper
Hey Kyo.

I don't have an actual answer or proof for you, but I know these results have been proved in multilinear or tensor algebra.

The intuition for me is that you are doing a composition of functions: you map one thing to something of dimension dim(X), and then map each element through its dim(Y)-dimensional counterpart, component by component, just like composing functions.

I guess what you could show is this: start with vectors of a specific length. If your matrix contains z vectors and they span n linearly independent directions, and every component of each vector is independent, then the total dimension is z*n independent components.

Now consider the multiplication of the two matrices. If the matrices are independent - i.e. there are no linear dependencies between the entries of one and the entries of the other - then the dimension of the shared system does not drop, and the dimensions multiply, since you are pairing each element of one system with each element of the other (like a Cartesian product of sorts, where (a, b) takes a from one matrix and b from the other).

Again, formalizing this would be done with tensor analysis and multilinear matrix products, but the intuition is the same as the cardinality idea behind the Cartesian product.
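One way to make the product analogy concrete (using the Kronecker product, which is the tensor-algebra gadget I have in mind) is via the standard vectorization identity, with vec stacking the columns of a matrix:

$$\operatorname{vec}(AB) = \big(B^{\mathsf{T}} \otimes I_m\big)\,\operatorname{vec}(A), \qquad \operatorname{rank}\big(B^{\mathsf{T}} \otimes I_m\big) = \operatorname{rank}(B)\cdot m,$$

so the map A --> AB on the mn-dimensional space M(mxn) has an mn x mn matrix whose rank is a product, which is exactly the "dimensions multiply like a Cartesian product" picture.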

#### Kyo

Thanks chiro, but sadly that seems too advanced, and I haven't come across it yet. The concepts I know and am trying to apply here are kernels and images, onto/one-to-one, and basic composition properties.