# Thread: matrices of linear equations

1.

Let V be an n-dimensional vector space, and assume $V=U\oplus W$ for subspaces U and W. Suppose $T:V\rightarrow V$ is a linear transformation such that $T(U)\subseteq U$ and $T(W)\subseteq W$.
Show that the matrix of T with respect to a basis of V that contains bases of U and W, has block shape:
$$\left( \begin{array}{cc} A & 0 \\ 0 & B \end{array} \right)$$

I am having trouble seeing how the matrix that represents T would be a 2x2 matrix.
So far in my working I have set the basis of U to be $\{v_1,v_2,...,v_k\}$ and the basis of W to be $\{v_{k+1},v_{k+2},...,v_n\}$, and I can see that $T(u)$ (where $u\in U$) can be written as a linear combination of the elements in the basis of U. But I can't see where to go from there. Help would really be appreciated.

2. No, it's not a $2\times 2$ matrix! Perhaps it wasn't written clearly in your book. Here $A$ and $B$ denote square matrices of sizes $k\times k$ and $l\times l$ (where $k=\dim U$ and $l=\dim W$), and each $0$ denotes a matrix of the appropriate size filled with $0$'s.

You have the correct first step. Now by assumption, $T(v_1),\dots, T(v_k)$ can be written with $v_1, \dots, v_k$ only... what will that look like in the matrix?

3. So:
$T(v_1)=a_{11}v_1 + a_{12}v_2 +...+a_{1k}v_k$ and so on for each $v_i$. So the matrix will be:
$A=[a_{ij}]$ where i and j represent the i-th row and the j-th column.
Then I could repeat this step for the elements in the basis of W. But I still can't see how that will give the shape given.

4. Originally Posted by worc3247
So:
$T(v_1)=a_{11}v_1 + a_{12}v_2 +...+a_{1k}v_k$ and so on for each $v_i$. So the matrix will be:
$A=[a_{ij}]$ where i and j represent the i-th row and the j-th column.
Then I could repeat this step for the elements in the basis of W. But I still can't see how that will give the shape given.

This is called being *reduced* and has a much more general form (if you're interested, see here for a proof of the analogous theorem). The idea is that if $\{x_1,\cdots,x_m\}$ is a basis for $U$ and $\{x_{m+1},\cdots,x_n\}$ is a basis for $W$, then $\{x_1,\cdots,x_m,x_{m+1},\cdots,x_{n}\}$ is a basis for $V$. But, since $T(x_k)\in U$ for $k=1,\cdots,m$, you can conclude that $\alpha_{i,k}=0$ for $k=1,\cdots,m$ and $i=m+1,\cdots,n$. Can you finish?

5. Originally Posted by Drexel28
conclude that $\alpha_{i,k}=0$ for $k=1,\cdots,m$ and $i=m+1,\cdots,n$. Can you finish?
Ok, I understood you up until this part. Where did the $\alpha$ come from and why does it equal 0 for those values?

6. Originally Posted by worc3247
Ok, I understood you up until this part. Where did the $\alpha$ come from and why does it equal 0 for those values?
For an ordered basis $\mathcal{B}=(x_1,\cdots,x_n)$ we know that there exists unique scalars $\alpha_{i,j},\text{ }i,j\in[n]$ such that $\displaystyle T(x_j)=\sum_{i=1}^{n}\alpha_{i,j}x_i$. We then define the matrix $\left[T\right]_{\mathcal{B}}$ to be the one whose $(i,j)^{\text{th}}$ coordinate is $\alpha_{i,j}$, right? But, take $x_1$ for example (going back to our particular case). We know that $\displaystyle T(x_1)=\sum_{i=1}^{n}\alpha_{i,1}x_i\in U$. But, for this to be true we must have that $\alpha_{m+1,1},\cdots,\alpha_{n,1}=0$. Make sense?
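If a concrete check helps, here's a small made-up numerical illustration (the basis and the map below are arbitrary choices of mine, not from your book): when the columns of a matrix $P$ are the ordered basis $x_1,\dots,x_n$, the scalars $\alpha_{i,j}$ are exactly the entries of $P^{-1}TP$.

```python
import numpy as np

# Illustrative 3-dimensional example (the basis P and the map T are made up).
# Columns of P are an ordered basis (x_1, x_2, x_3) of R^3.
P = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [0., 0., 1.]])
T = np.array([[2., 0., 1.],
              [0., 3., 0.],
              [1., 0., 2.]])

# The unique scalars alpha_{i,j} with T(x_j) = sum_i alpha_{i,j} x_i are the
# coordinates of T(x_j) in the basis, i.e. the entries of P^{-1} T P.
alpha = np.linalg.inv(P) @ T @ P

# Sanity check: rebuild T(x_j) from the expansion, for each j.
for j in range(3):
    assert np.allclose(T @ P[:, j], P @ alpha[:, j])
```

So the $j$-th column of $[T]_{\mathcal{B}}$ really is the coordinate vector of $T(x_j)$.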

7. And likewise for the subspace W you will have $\alpha_{i,j} = 0$ for $i=1,\cdots,m$ and $j=m+1,\cdots,n$, which would then give the matrix shape needed? I think...

8. Same thing, said in a little different way: you have a basis $\{v_1, v_2, ..., v_k\}$ for U and a basis $\{v_{k+1}, v_{k+2}, ..., v_n\}$ for W, as you said originally. Of course, $\{v_1, v_2, ..., v_k, v_{k+1}, ..., v_n\}$ is then a basis for $V= U\oplus W$.

You find the matrix representing T, in this basis, by applying T to each basis vector in turn. The coefficients of the expansion of $T(v_i)$ in those basis vectors give the $i$-th column of the matrix. In particular, $T(v_1)= a_{11}v_1+ a_{21}v_2+ ...+ a_{k1}v_{k}+ 0v_{k+1}+ ...+ 0v_n$ because $T(U)\subseteq U$: each of the vectors $v_1, v_2, ..., v_k$ is in U, so $T(v_1), T(v_2), ..., T(v_k)$ are also in U and have no component in W. That's why there are only "0"s in rows $k+1$ to $n$ of the first $k$ columns.

Similarly, because $v_{k+1}, ..., v_n$ are all in W and $T(W)\subseteq W$, the vectors $T(v_{k+1}), ..., T(v_n)$ are all in W and so have no component in U. The coefficients of $v_1, ..., v_k$ in their expansions are all 0, and so the first $k$ rows of columns $k+1$ to $n$ are all 0.
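If you'd like to see the block shape appear numerically, here is a small made-up NumPy sketch (the subspaces U, W and the map T below are arbitrary choices, just to illustrate; nothing here is specific to your problem):

```python
import numpy as np

# Hypothetical example: V = R^4, U = span{u1, u2}, W = span{w1, w2},
# and T is built so that T(U) ⊆ U and T(W) ⊆ W.

# Bases of U and W, chosen so that together they form a basis of R^4.
u1, u2 = np.array([1., 0., 1., 0.]), np.array([0., 1., 0., 1.])
w1, w2 = np.array([1., 0., -1., 0.]), np.array([0., 1., 0., -1.])
P = np.column_stack([u1, u2, w1, w2])   # adapted basis -> standard basis

# Define T by its action on the adapted basis: T keeps U inside U, W inside W.
T_u1, T_u2 = 2*u1 + u2, 3*u2            # so the A-block is [[2, 0], [1, 3]]
T_w1, T_w2 = w1 - w2, 4*w1              # so the B-block is [[1, 4], [-1, 0]]
T_std = np.column_stack([T_u1, T_u2, T_w1, T_w2]) @ np.linalg.inv(P)

# Matrix of T in the adapted basis: column j holds the coordinates of T(v_j).
M = np.linalg.inv(P) @ T_std @ P
print(np.round(M, 10))
# The two off-diagonal 2x2 blocks come out (numerically) zero, as claimed.
```

The top-left block is A (how T acts on U in its own basis), the bottom-right block is B, and the zero blocks are exactly the "no component in the other subspace" statements above.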

9. Got it! Thank you so much!