
General concepts regarding linear transformations and matrices

  1. #1
    Junior Member
    Joined
    Nov 2009
    Posts
    27


    Hi, am I right in thinking that for a L.T. T: U → V represented by Au = v, u and v refer to the coordinates of vectors in the vector spaces U and V with respect to each space's basis, written as column vectors? I'm getting a little confused, since T(u) = v treats u and v as vectors and not their coordinates, but then Au = v wouldn't make sense in a polynomial space where, for example, u = αx² + βx + γ, unless u and v refer to the coordinates in column form.

    If so, does this mean that given a vector space V over a field K, we are essentially using elements of K as entries in a coordinate system with respect to the basis of V s.t. each distinct element in V has a unique coordinate representation or equivalently, is a unique L.C. of basis vectors?

    So does this mean that all spaces over the same field with the same dimension are isomorphic? For example, the row vector (α, β, γ) in K³ and the polynomial αx² + βx + γ in K[x]≤2 both have the same coordinates (namely (α, β, γ)), so a L.T. taking those coordinates to (α′, β′, γ′) would be represented by the same matrix for both spaces, since the matrix only transforms one coordinate system to another while applying a given L.T. on the way.

    Thanks very much.

  2. #2
    MHF Contributor

    Joined
    Apr 2005
    Posts
    15,306
    Thanks
    1282
    Quote Originally Posted by tunaaa
    Hi, am I right in thinking that for a L.T. T: U → V represented by Au = v, u and v refer to the coordinates of vectors in the vector spaces U and V with respect to each space's basis, written as column vectors?

    No, you are not. u and v refer to vectors, not to their components. And it is not necessary for a vector space to be given a specific basis in order to talk about vectors in that space or about a linear transformation from one space to another.

    If I write the linear transformation as a matrix, and write, say,
    \begin{bmatrix}3 & 1 \\ 2 & 2\end{bmatrix}\begin{bmatrix}3 \\ 2\end{bmatrix}= \begin{bmatrix}11 \\ 10\end{bmatrix}, then I am thinking of \begin{bmatrix}3 \\ 2\end{bmatrix} as representing the vector 3u_1+ 2u_2 where u_1 and u_2 are basis vectors for the two dimensional space U, and \begin{bmatrix}11 \\ 10\end{bmatrix} as representing the vector 11v_1+ 10v_2 where v_1 and v_2 are basis vectors for the two dimensional vector space V.
    It is only when we choose to represent vectors and linear transformations as matrices that the components in a specific basis become important.
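    That arithmetic is easy to verify directly. Here is a small numerical check (my own NumPy sketch, not part of the original post):

    ```python
    import numpy as np

    # Matrix of the linear transformation with respect to the chosen bases
    A = np.array([[3, 1],
                  [2, 2]])

    # Coordinates of u = 3*u1 + 2*u2 in the basis (u1, u2) of U
    u = np.array([3, 2])

    # A @ u gives the coordinates of T(u) in the basis (v1, v2) of V
    v = A @ u
    print(v)  # [11 10], i.e. T(u) = 11*v1 + 10*v2
    ```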

    I’m getting a little confused since
    T(u) = v refers to u, v as vectors and not their coordinates, but then Au = v wouldn’t make sense in a polynomial space where, for example, u = α x²+ βx+ γ, unless u, v refers to the coordinates in column form.
    I don't see how you get that. I could, for example, take A to be the derivative operator on the space of polynomials of degree 2 or less: Au= A(\alpha x^2+ \beta x+ \gamma)= 2\alpha x+ \beta= v. In that case u and v are vectors: u= \alpha x^2+ \beta x+ \gamma and v= 2\alpha x+ \beta.

    If I choose to write A in matrix form, taking as basis x^2, x, 1, in that order, then I would think of the polynomial \alpha x^2+ \beta x+ \gamma as represented by the column vector \begin{bmatrix}\alpha \\ \beta \\ \gamma\end{bmatrix}. Notice that the order of the vectors in the basis is also important: if I had taken the basis to be the same 3 functions but in the order 1, x, x^2, that same polynomial would be represented by \begin{bmatrix}\gamma \\ \beta \\ \alpha \end{bmatrix}.

    Using the first order, this linear transformation would be represented by \begin{bmatrix}0 & 0 & 0 \\ 2 & 0 & 0 \\ 0 & 1 & 0\end{bmatrix}.
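    As a sanity check (again a NumPy sketch of my own, using the basis in the order x², x, 1 as above), applying that matrix to the coordinate column of a sample polynomial reproduces its derivative:

    ```python
    import numpy as np

    # Matrix of the derivative operator on polynomials of degree <= 2,
    # with respect to the ordered basis (x^2, x, 1) in domain and codomain
    D = np.array([[0, 0, 0],
                  [2, 0, 0],
                  [0, 1, 0]])

    # p(x) = 3x^2 + 5x + 7 has coordinates (alpha, beta, gamma) = (3, 5, 7)
    p = np.array([3, 5, 7])

    dp = D @ p
    print(dp)  # [0 6 5], the coordinates of p'(x) = 6x + 5
    ```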

    If so, does this mean that given a vector space V over a field K, we are essentially using elements of K as entries in a coordinate system with respect to the basis of V s.t. each distinct element in V has a unique coordinate representation or equivalently, is a unique L.C. of basis vectors?
    Well, we don't have to represent vectors as columns or rows of numbers, but we can.

    So does this mean that all spaces over the same field with the same dimension are isomorphic? For example, the row vector (α, β, γ) in K³ and the polynomial αx² + βx + γ in K[x]≤2 both have the same coordinates (namely (α, β, γ)), so a L.T. taking those coordinates to (α′, β′, γ′) would be represented by the same matrix for both spaces, since the matrix only transforms one coordinate system to another while applying a given L.T. on the way.

    Thanks very much.
    Yes, this is true! The fact that vectors in any n-dimensional vector space over a field K can be represented, upon choice of a basis, as ordered n-tuples of elements of K means that all vector spaces of the same finite dimension, over the same field, are isomorphic.

    You can demonstrate the isomorphism by choosing a specific basis, in a specific order, for each vector space and constructing a function that maps each basis vector to the corresponding basis vector in the other space, then extending it to all other vectors "by linearity": T(\alpha_1u_1+ \alpha_2u_2+ \cdots+ \alpha_nu_n)= \alpha_1T(u_1)+ \alpha_2T(u_2)+ \cdots+ \alpha_nT(u_n).
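    To make that concrete, here is one way to realize the isomorphism between K³ and the polynomials of degree ≤ 2 as the coordinate map, and to check linearity at a sample point (the helper `to_poly` is my own illustration, not from the thread):

    ```python
    import numpy as np

    # Identify the tuple (a, b, c) in K^3 with the polynomial a*x^2 + b*x + c.
    def to_poly(coords):
        """Return the polynomial function with coordinates (a, b, c)."""
        a, b, c = coords
        return lambda x: a * x**2 + b * x + c

    # Linearity: to_poly(s*u + t*w) = s*to_poly(u) + t*to_poly(w),
    # checked pointwise at x = 2
    u = np.array([1.0, 2.0, 3.0])
    w = np.array([4.0, 5.0, 6.0])
    s, t = 2.0, -3.0

    lhs = to_poly(s * u + t * w)(2.0)
    rhs = s * to_poly(u)(2.0) + t * to_poly(w)(2.0)
    print(lhs == rhs)  # True: the coordinate map respects linear combinations
    ```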

    Notice the restriction to finite dimensional spaces. Two vector spaces over the same field with bases of the same cardinality are still isomorphic as abstract vector spaces, but infinite dimensional spaces (such as the space of all polynomials or the space of all continuous functions) usually carry extra structure, such as a norm or a topology, which such an isomorphism need not preserve. "Linear Algebra" normally restricts itself to finite dimensional vector spaces (Halmos' famous text was titled "Finite-Dimensional Vector Spaces"), while infinite dimensional vector spaces, typically function spaces, are dealt with in "Functional Analysis".

  3. #3
    Junior Member
    Joined
    Nov 2009
    Posts
    27
    Thanks for your comprehensive response - it's totally clear now.
