Math Help - matrices of linear equations

  1. #1 worc3247

    Let V be an n-dimensional vector space, and assume V=U\oplus W for subspaces U and W. Suppose T:V\rightarrow V is a linear transformation such that T(U)\subseteq U and T(W)\subseteq W.
    Show that the matrix of T with respect to a basis of V that contains bases of U and W has block shape:
    \[ \left( \begin{array}{cc} A & 0 \\ 0 & B \end{array} \right) \]

    I am having trouble seeing how the matrix that represents T would be a 2x2 matrix.
    So far in my working I have set the basis of U to be \{v_1,v_2,...,v_k\} and the basis of W to be \{v_{k+1},v_{k+2},...,v_n\}, and I can see that T(u) (where u\in U) can be written as a linear combination of the elements of the basis of U. But I can't see where to go from there. Help would really be appreciated.

  2. #2 Bruno J.
    No, it's not a 2\times 2 matrix! Perhaps it wasn't written clearly in your book. Here A,B denote square matrices of dimensions k and l (where k=\dim U and l=\dim W), and 0 denotes a matrix of the appropriate size filled with 0's.

    You have the correct first step. Now by assumption, T(v_1),\dots, T(v_k) can be written with v_1, \dots, v_k only... what will that look like in the matrix?

  3. #3 worc3247
    So:
    T(v_1)=a_{11}v_1 + a_{21}v_2 +\cdots+a_{k1}v_k, and so on for each v_j: the coefficients of T(v_j) form the j-th column. So the matrix will be:
    A=[a_{ij}], where i and j index the i-th row and the j-th column.
    Then I could repeat this step for the elements in the basis of W. But I still can't see how that will give the shape given.

  4. #4 Drexel28
    Quote Originally Posted by worc3247:
    So:
    T(v_1)=a_{11}v_1 + a_{21}v_2 +\cdots+a_{k1}v_k, and so on for each v_j: the coefficients of T(v_j) form the j-th column. So the matrix will be:
    A=[a_{ij}], where i and j index the i-th row and the j-th column.
    Then I could repeat this step for the elements in the basis of W. But I still can't see how that will give the shape given.

    This is called being reduced, and it has a much more general form (if you're interested, see here for a proof of the analogous theorem). The idea is that if \{x_1,\cdots,x_m\} is a basis for U and \{x_{m+1},\cdots,x_n\} is a basis for W, then \{x_1,\cdots,x_m,x_{m+1},\cdots,x_{n}\} is a basis for V. But, since T(x_k)\in U for k=1,\cdots,m, we may conclude that \alpha_{i,k}=0 for k=1,\cdots,m and i=m+1,\cdots,n. Can you finish?

  5. #5 worc3247
    Quote Originally Posted by Drexel28:
    conclude that \alpha_{i,k}=0 for k=1,\cdots,m and i=m+1,\cdots,n. Can you finish?
    Ok, I understood you up until this part. Where did the \alpha come from, and why does it equal 0 for those values?

  6. #6 Drexel28
    Quote Originally Posted by worc3247:
    Ok, I understood you up until this part. Where did the  \alpha come from and why does it equal 0 for those values?
    For an ordered basis \mathcal{B}=(x_1,\cdots,x_n) we know that there exist unique scalars \alpha_{i,j},\text{ }i,j\in[n] such that \displaystyle T(x_j)=\sum_{i=1}^{n}\alpha_{i,j}x_i. We then define the matrix \left[T\right]_{\mathcal{B}} to be the one whose (i,j)^{\text{th}} entry is \alpha_{i,j}, right? But take x_1, for example (going back to our particular case). We know that \displaystyle T(x_1)=\sum_{i=1}^{n}\alpha_{i,1}x_i\in U. But for this to be true we must have \alpha_{m+1,1}=\cdots=\alpha_{n,1}=0. Make sense?
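    [Editor's note: a quick computational sanity check of this column-by-column construction, using a hypothetical toy example not from the thread. In R^4, let U be spanned by the first two standard basis vectors and W by the last two, and let T act blockwise, so T(U)\subseteq U and T(W)\subseteq W by construction. Building [T]_\mathcal{B} column by column then produces exactly the zero pattern described.]

```python
# Hypothetical toy example: V = R^4, U = span{e1, e2}, W = span{e3, e4}.
A = [[2, 1],
     [0, 3]]   # action of T on U in the basis (v1, v2)
B = [[5, 0],
     [4, 6]]   # action of T on W in the basis (v3, v4)

def T(x):
    """Apply T blockwise: first two coordinates via A, last two via B."""
    u = [A[i][0] * x[0] + A[i][1] * x[1] for i in range(2)]
    w = [B[i][0] * x[2] + B[i][1] * x[3] for i in range(2)]
    return u + w

# Column j of [T]_B is T(x_j) expanded in the basis; with the standard
# basis, the expansion coefficients are just the coordinates of T(x_j).
basis = [[1 if i == j else 0 for i in range(4)] for j in range(4)]
M = [[T(basis[j])[i] for j in range(4)] for i in range(4)]  # M[i][j] = alpha_{i,j}

for row in M:
    print(row)
# rows: [2, 1, 0, 0], [0, 3, 0, 0], [0, 0, 5, 0], [0, 0, 4, 6]
```

    The lower-left 2x2 block (alpha_{i,j} with i > 2, j <= 2) and the upper-right block are all zero, exactly as the argument above predicts.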

  7. #7 worc3247
    And likewise for the subspace W you will have \alpha_{i,j}=0 for i=1,\cdots,m and j=m+1,\cdots,n, which would then give the matrix shape needed? I think...

  8. #8 MHF Contributor
    Same thing, said in a slightly different way: you have a basis \{v_1, v_2, ..., v_k\} for U and a basis \{v_{k+1}, v_{k+2}, ..., v_n\} for W, as you said originally. Of course, \{v_1, v_2, ..., v_k, v_{k+1}, ..., v_n\} is a basis for V= U\oplus W.

    You find the matrix representing T, in this basis, by applying T to each basis vector in turn. The coefficients of the expansion of T(v_i) in those basis vectors give the i-th column of the matrix. In particular, T(v_1)= a_{11}v_1+ a_{21}v_2+ \cdots+ a_{k1}v_{k}+ 0v_{k+1}+ \cdots+ 0v_n because T(U)\subseteq U: each of the vectors v_1, v_2, ..., v_k is in U, so T(v_1), T(v_2), ..., T(v_k) are also in U and have no component in W. That's why there are only "0"s in rows k+1 to n of the first k columns.

    Similarly for T(v_{k+1}), ..., T(v_n): because v_{k+1}, ..., v_n are all in W, and T(W)\subseteq W, the images T(v_{k+1}), ..., T(v_n) are all in W and so have no component in U. The coefficients of v_1, ..., v_k are all 0, and so rows 1 to k of columns k+1 to n are all 0.
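    [Editor's note: writing out the matrix this argument produces, in the thread's notation, with A=[a_{ij}] the k\times k matrix of T restricted to U, B=[b_{ij}] the l\times l matrix of T restricted to W, and l = n - k:]

    \[
    [T]_{\mathcal{B}} =
    \left( \begin{array}{cccccc}
    a_{11} & \cdots & a_{1k} & 0 & \cdots & 0 \\
    \vdots & \ddots & \vdots & \vdots & & \vdots \\
    a_{k1} & \cdots & a_{kk} & 0 & \cdots & 0 \\
    0 & \cdots & 0 & b_{11} & \cdots & b_{1l} \\
    \vdots & & \vdots & \vdots & \ddots & \vdots \\
    0 & \cdots & 0 & b_{l1} & \cdots & b_{ll}
    \end{array} \right)
    = \left( \begin{array}{cc} A & 0 \\ 0 & B \end{array} \right)
    \]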

  9. #9 worc3247
    Got it! Thank you so much!
