Do linear transformations always involve multiplying a coefficient matrix by a single column matrix or a single row matrix? Can a linear transformation be the product of a coefficient matrix and another coefficient matrix?
It's my understanding that any linear transformation can be represented as the multiplication of a vector by a matrix, resulting in another vector that generally (but not necessarily) is different from the "original" or "source" vector. There may be other ways to represent a linear transformation, but the matrix-vector representation is common.
As for your second question, the product of two matrices, provided the multiplication is defined, will yield another matrix. The new matrix can then be used to transform a vector, provided that the dimensions are right.
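Both points can be checked numerically. The sketch below (with arbitrary illustrative matrices, not anything from the thread) shows a matrix-vector product yielding a vector, and the product of two matrices yielding a new matrix that transforms vectors just like A applied after B:

```python
import numpy as np

# Illustrative 2x2 matrices and a vector (arbitrary values).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # a 90-degree rotation
x = np.array([1.0, 1.0])

# Multiplying a matrix by a vector yields another vector.
y = A @ x
print(y)  # [3. 7.]

# The product of two matrices is another matrix; by associativity,
# (A @ B) @ x equals A @ (B @ x), so C = A @ B transforms vectors too.
C = A @ B
print(np.allclose(C @ x, A @ (B @ x)))  # True
```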
I hope this is clear.
Sort of. I know that if you have a linear transformation involving a matrix and a vector, you'd get a vector (as a consequence of matrix multiplication).
But what about the interpretation of multiplying a coefficient matrix by another coefficient matrix? I know that this can represent a composition of linear transformations. However, can it also represent the transformation of multiple vectors?
For example, let A and B be 3x3 matrices.
A*B=C (C is also 3x3).
Are the column vectors of C the linear transformation (associated with coefficient matrix A) of the respective column vectors of B?
Hmm. Assuming we're doing classical matrix multiplication (I'll note in passing that other ways to multiply matrices do exist) then the column vectors of the product matrix are indeed the vectors we get if we multiply A by the columns of B one by one as you describe.
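A quick numerical check of this column-by-column interpretation, using arbitrary illustrative 3x3 matrices:

```python
import numpy as np

# Arbitrary illustrative 3x3 matrices.
A = np.arange(9, dtype=float).reshape(3, 3)
B = np.eye(3) + 1.0

C = A @ B

# Column j of C is A applied to column j of B,
# i.e. C's columns are the transformed columns of B.
for j in range(3):
    assert np.allclose(C[:, j], A @ B[:, j])
print("columns match")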
I've always thought that a linear transformation is something we do to a vector, and the product is another vector, not a matrix. What you get by multiplying two matrices is another tool for accomplishing another linear transformation. If you can show me that the distinction you make is useful or meaningful, I'll change my mind.
I don't know just where you are in LA, but maybe you know that square matrices can often be decomposed, i.e., represented as the product of other matrices.
In the plane, for instance, there are four types of isometries, plus dilations and shears. I believe that all linear transformations in the plane can be classified as one of these, or as some composition of them. It should be possible to decompose any linear transformation into a series of these.
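The composition idea above can be sketched numerically. Below, a rotation, a shear, and a dilation in the plane (illustrative parameter values) are chained into a single matrix, which acts on a vector the same way as applying the three transformations in sequence:

```python
import numpy as np

theta = np.pi / 4  # illustrative rotation angle
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
shear = np.array([[1.0, 1.5],   # horizontal shear, factor 1.5
                  [0.0, 1.0]])
dilation = 2.0 * np.eye(2)      # uniform scaling by 2

# Composition of transformations = product of their matrices,
# applied right to left: dilate, then shear, then rotate.
T = rotation @ shear @ dilation
x = np.array([1.0, 0.0])
assert np.allclose(T @ x, rotation @ (shear @ (dilation @ x)))
print("composition matches step-by-step application")
```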
A linear transformation is a function between two vector spaces V and W.
When V and W are finite-dimensional, there is always a matrix that represents this transformation. A linear transformation maps lines into lines and polygons into polygons.
You may check here for more details:
Linear Transformation -- from Wolfram MathWorld
The definition of "linear transformation" is that $\displaystyle A(x+y)= Ax+ Ay$ and $\displaystyle A(rx)= rAx$.
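These two defining properties can be verified numerically for matrix multiplication. The sketch below uses a random matrix, random vectors, and an arbitrary scalar (all illustrative values):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)
y = rng.standard_normal(3)
r = 2.5  # arbitrary scalar

# Matrix multiplication satisfies both defining properties of linearity:
assert np.allclose(A @ (x + y), A @ x + A @ y)  # A(x+y) = Ax + Ay
assert np.allclose(A @ (r * x), r * (A @ x))    # A(rx) = r(Ax)
print("linearity holds")
```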
Every linear transformation from a vector space of dimension n to a vector space of dimension m can be represented by a matrix with m rows and n columns. However, linear transformations on infinite dimensional vector spaces cannot be represented by ordinary matrices. What we can do in the case of "function spaces" is represent the linear transformation from u(x) to v(x) as $\displaystyle v(x)= \int_a^x K(x,t)u(t)dt$ with the kernel, K(x,t), taking the role of the matrix.
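To see how the kernel plays the role of a matrix, one can discretize the integral on a grid, at which point it literally becomes a matrix-vector product. A minimal sketch, assuming the simple case $K(x,t)=1$, $u(t)=1$ on $[0,1]$, so that $v(x)=\int_0^x dt = x$:

```python
import numpy as np

# Discretize v(x) = integral_0^x K(x,t) u(t) dt on a uniform grid.
# With K(x,t) = 1, the discretized kernel is a lower-triangular matrix
# of quadrature weights, and applying it is a matrix-vector product.
n = 1000
t = np.linspace(0.0, 1.0, n)
h = t[1] - t[0]

K = np.tril(np.ones((n, n))) * h  # crude Riemann-sum quadrature
u = np.ones(n)                    # samples of u(t) = 1
v = K @ u                         # samples of v(x_i), approximately x_i

# The result should approximate v(x) = x up to discretization error.
assert np.allclose(v, t, atol=2 * h)
print("discretized kernel acts like a matrix")
```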