1. ## Vectors and Subspaces

Hey all,

The question states, "Find a basis for the solution space in R^5 of this linear system. Find this subspace's dimension." The linear system is as follows:
$\begin{bmatrix}x_1&2x_2&3x_3&4x_4&5x_5\\2x_1&-x_2&2x_3&-x_4&2x_5\\0&5x_2&4x_3&9x_4&8x_5\\x_1&7x_2&7x_3&13x_4&13x_5\end{bmatrix} = \begin{bmatrix}0\\0\\0\\0\end{bmatrix}$

What I've done so far:

I've used Gauss-Jordan (complete) elimination to reduce the augmented matrix... (I can't get past that.)

What I don't understand:

I know that I should use the reduced matrix to find the basis of the solution space... but I don't know how to do this...

Also, what does "subspace's dimension" mean?

Thank you to all providing assistance

2. Originally Posted by tsal15
Hey all,

The question states, "Find a basis for the solution space in R^5 of this linear system. Find this subspace's dimension." The linear system is as follows:
$\begin{bmatrix}x_1&2x_2&3x_3&4x_4&5x_5\\2x_1&-x_2&2x_3&-x_4&2x_5\\0&5x_2&4x_3&9x_4&8x_5\\x_1&7x_2&7x_3&13x_4&13x_5\end{bmatrix} = \begin{bmatrix}0\\0\\0\\0\end{bmatrix}$
Dear tsal15,

The question is not clear at all. How can a 4 x 5 matrix equal a 4 x 1 vector?

Do you mean the matrix representation of the system of equations? That is, you want to solve Ax = 0, where

$A = \begin{bmatrix}1 & 2 & 3 & 4 & 5\\2 & -1 & 2 & -1 & 2 \\0 & 5 & 4 & 9 & 8 \\ 1 & 7 & 7 & 13 & 13\end{bmatrix}$

What I've done so far:

I've used Gauss-Jordan (complete) elimination to reduce the augmented matrix... (I can't get past that.)
First of all, do you realise that the solution space is actually the nullspace of the matrix A?

Can you write the Gauss-Jordan reduced matrix here?

What I don't understand:

I know that I should use the reduced matrix to find the basis of the solution space... but I don't know how to do this...

Also, what does "subspace's dimension" mean?

Thank you to all providing assistance
Well, the solution space is a finite-dimensional vector space and thus has a dimension; that dimension is actually the nullity of A....

3. Hey Isomorphism,

Originally Posted by Isomorphism
Dear tsal15,

The question is not clear at all. How can a 4 x 5 matrix equal a 4 x 1 vector?

Do you mean the matrix representation of the system of equations? That is, you want to solve Ax = 0, where

$A = \begin{bmatrix}1 & 2 & 3 & 4 & 5\\2 & -1 & 2 & -1 & 2 \\0 & 5 & 4 & 9 & 8 \\ 1 & 7 & 7 & 13 & 13\end{bmatrix}$
Yes, I apologize, I think I didn't provide enough information. But yes, I need to solve Ax = 0, where A is the matrix you typed above.

First of all, do you realise that the solution space is actually the nullspace of the matrix A?
When you refer to the solution space, do you mean the 'x'? If so, yes I've realised that x = 0, right?

Can you write the Gauss-Jordan reduced Matrix here?

$\left[\begin{array}{ccccc|c}1 & 0 & 3 & 4 & 5 & 0\\0 & 1 & \frac{4}{5} & \frac{9}{5} & \frac{8}{5} & 0 \\0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0\end{array}\right]$

Well, the solution space is a finite-dimensional vector space and thus has a dimension; that dimension is actually the nullity of A....
I understand $A(x_1 + x_2) = Ax_1 + Ax_2 = 0 + 0 = 0$, where x = 0 and x is the vector solution...

Thanks for your continued help, Isomorphism

4. Originally Posted by tsal15
Hey Isomorphism,

Yes, I apologize, I think I didn't provide enough information. But yes, I need to solve Ax = 0, where A is the matrix you typed above.

When you refer to the solution space, do you mean the 'x'? If so, yes I've realised that x = 0, right?
No, you should not realize that! Certainly x = 0 is always a solution to Ax = 0, but if A is not an invertible matrix, that is not the only such solution. The set of all solutions to Ax = 0 forms a subspace, just as this problem implies.

$\left[\begin{array}{ccccc|c}1 & 0 & 3 & 4 & 5 & 0\\0 & 1 & \frac{4}{5} & \frac{9}{5} & \frac{8}{5} & 0 \\0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0\end{array}\right]$

I understand $A(x_1 + x_2) = Ax_1 + Ax_2 = 0 + 0 = 0$, where x = 0 and x is the vector solution...
Once again, yes, x= 0 is a solution. No, it doesn't follow that x= 0 is the only solution.

Thanks for your continued help, Isomorphism
From your reduced matrix above, you are saying that
$\begin{bmatrix}1 & 0 & 3 & 4 & 5\\0 & 1 & \frac{4}{5} & \frac{9}{5} & \frac{8}{5} \\0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0\end{bmatrix}\begin{bmatrix}x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0 \\ 0 \\ 0\end{bmatrix}$
which means you must have $x_1+ 3x_3+ 4x_4+ 5x_5= 0$, $x_2+ \frac{4}{5}x_3+ \frac{9}{5}x_4+ \frac{8}{5}x_5= 0$.

You have two equations in 5 unknowns, so you can expect to solve for two of them in terms of the other 3. For example, we can easily write $x_1= -3x_3- 4x_4- 5x_5$ and $x_2= -\frac{4}{5}x_3- \frac{9}{5}x_4- \frac{8}{5}x_5$. So we can choose $x_3$, $x_4$, and $x_5$ to be any numbers we want and then solve for $x_1$ and $x_2$. That tells us that this subspace (the null space of the matrix A) has dimension 3.

More specifically, if we take $x_3= 1$, $x_4= x_5= 0$, then $x_1= -3$ and $x_2= -\frac{4}{5}$, so $\left[-3, -\frac{4}{5}, 1, 0, 0\right]$ is a vector in that subspace.
If we take $x_4= 1$, $x_3= x_5= 0$, then $x_1= -4$ and $x_2= -\frac{9}{5}$, so $\left[-4, -\frac{9}{5},0, 1, 0\right]$ is another vector in that subspace.
Finally, if we take $x_5= 1$, $x_3= x_4= 0$, then $x_1= -5$ and $x_2= -\frac{8}{5}$, so $\left[-5, -\frac{8}{5}, 0, 0, 1\right]$ is a third vector in that subspace.

And, by using "0"s and "1"s as we did, we have guaranteed that they are independent: the set of vectors { $\left[-3, -\frac{4}{5}, 1, 0, 0\right], \left[-4, -\frac{9}{5},0, 1, 0\right], \left[-5, -\frac{8}{5}, 0, 0, 1\right]$ } is a basis for the subspace. Since there are three vectors in that basis, the subspace has dimension 3.
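The independence claim can be cross-checked numerically. This is just a sketch (assuming numpy is available, and using the three vectors listed above): stacking them as rows and computing the rank of the resulting matrix should give 3.

```python
# Hypothetical cross-check with numpy: the three candidate basis
# vectors are stacked as rows; full row rank (3) means they are
# linearly independent.
import numpy as np

v1 = np.array([-3, -4/5, 1, 0, 0])
v2 = np.array([-4, -9/5, 0, 1, 0])
v3 = np.array([-5, -8/5, 0, 0, 1])

M = np.vstack([v1, v2, v3])          # 3 x 5 matrix
rank = np.linalg.matrix_rank(M)
print(rank)                          # 3, so the vectors are independent
```

The identity block in the last three coordinates guarantees full rank no matter what the first two entries of each vector are.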

5. Dear HallsofIvy,

Unfortunately, I had given you an incorrect reduced-row echelon form matrix of linear equations...

In fact, the final matrix should look like the following:

$\left[\begin{array}{ccccc|c}1 & 0 & \frac{7}{5} & \frac{2}{5} & \frac{9}{5} & 0\\0 & 1 & \frac{4}{5} & \frac{9}{5} & \frac{8}{5} & 0 \\0 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 0\end{array}\right]$

Why have you substituted the values 0 (zero) and 1 (including $-1$) into the equations? Also, how does this confirm that the vectors are linearly independent?

I'm sure that the new matrix has a basis in dimensions of 3, (correct way to put it?).

My workings out:

For $x_3$, $x_4$, and $x_5$ I've assigned $a$, $b$ & $c$ respectively (this is the way my textbook does it...).

So, the solution set is as follows:

$\begin{bmatrix}-\frac{7a}{5} - \frac{2b}{5} - \frac{9c}{5}\\ -\frac{4a}{5} - \frac{9b}{5} - \frac{8c}{5}\\ a\\ b\\ c\end{bmatrix}$

Therefore (and it's a bit obvious), there exist 3 vectors in the subspace that form the basis...
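As a sanity check of the workings above (a sketch assuming sympy is available), one can substitute the parametrized solution back into $Ax = 0$ with symbolic $a$, $b$, $c$:

```python
# Symbolic check: the parametrized solution vector should satisfy
# A*x = 0 for every choice of a, b, c.
import sympy as sp

a, b, c = sp.symbols('a b c')

A = sp.Matrix([[1,  2, 3,  4,  5],
               [2, -1, 2, -1,  2],
               [0,  5, 4,  9,  8],
               [1,  7, 7, 13, 13]])

x = sp.Matrix([-sp.Rational(7, 5)*a - sp.Rational(2, 5)*b - sp.Rational(9, 5)*c,
               -sp.Rational(4, 5)*a - sp.Rational(9, 5)*b - sp.Rational(8, 5)*c,
               a,
               b,
               c])

residual = (A * x).applyfunc(sp.expand)
print(residual == sp.zeros(4, 1))    # True: every such x solves Ax = 0
```

If the residual were nonzero for some symbolic term, the parametrization (or the row reduction behind it) would contain an error.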

Now, how do I know if it is linearly dependent?

Thanks,

tsal15

P.S. Could you double check my workings, please?

6. Originally Posted by tsal15
Why have you substituted the values 0 (zero) and 1 (including $-1$) into the equations? Also, how does this confirm that the vectors are linearly independent?
Well, try applying the definition of linear independence to see if he is correct... Carefully observe that it is the last three coordinates of the vectors that force the coefficients to be 0. Do you realise the significance of his choice now?
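To spell out that argument with the vectors read off from the corrected reduced matrix: suppose some combination vanishes,

$\alpha\begin{bmatrix}-\frac{7}{5}\\-\frac{4}{5}\\1\\0\\0\end{bmatrix}+\beta\begin{bmatrix}-\frac{2}{5}\\-\frac{9}{5}\\0\\1\\0\end{bmatrix}+\gamma\begin{bmatrix}-\frac{9}{5}\\-\frac{8}{5}\\0\\0\\1\end{bmatrix}=\begin{bmatrix}0\\0\\0\\0\\0\end{bmatrix}$

Reading off coordinates 3, 4, and 5 gives $\alpha = 0$, $\beta = 0$, and $\gamma = 0$ directly, so the only vanishing combination is the trivial one, and the three vectors are linearly independent.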

Originally Posted by tsal15
I'm sure that the new matrix has a basis in dimensions of 3, (correct way to put it?).
That's the wrong way to put it. A matrix "having a basis" does not make sense...

You have to talk about dimension for a vector space... So in this case you will be right if you say the null space has dimension 3.

Originally Posted by tsal15
So, the solution set is as follows:

$\begin{bmatrix}-\frac{7a}{5} - \frac{2b}{5} - \frac{9c}{5}\\ -\frac{4a}{5} - \frac{9b}{5} - \frac{8c}{5}\\ a\\ b\\ c\end{bmatrix}$

Therefore (and it's a bit obvious), there exist 3 vectors in the subspace that form the basis... Now, how do I know if it is linearly dependent?
You want them to be independent, don't you?

So choose the "obvious" vectors and prove they are linearly independent, by applying the definition.

P.S.: Sorry I was away for a few days. I shall be visiting the forum less frequently since my second semester has started. Good luck!

7. Originally Posted by Isomorphism
Well, try applying the definition of linear independence to see if he is correct... Carefully observe that it is the last three coordinates of the vectors that force the coefficients to be 0. Do you realise the significance of his choice now?
My understanding of the definition of linear independence is that the sum of the scalars multiplied by their respective vectors equals zero, i.e. $x_1v_1 + x_2v_2 + ... + x_mv_m = 0$, which in other words means that the coefficients should be 0... Am I right?

You want them to be independent, don't you?
No, I spoke to my teacher (who really is more useful as a stick than as anything else); he hinted that the vectors were linearly dependent.

i.e. These vectors: $\begin{bmatrix} 1 \\ 2 \\ 0 \\ 1\end{bmatrix}$ $\begin{bmatrix} 2 \\ -1 \\ 5 \\ 7 \end{bmatrix}$ $\begin{bmatrix} 3 \\ 2 \\ 4 \\ 7 \end{bmatrix}$ $\begin{bmatrix} 4 \\ -1 \\ 9 \\ 13 \end{bmatrix}$ $\begin{bmatrix} 5 \\ 2 \\ 8 \\ 13\end{bmatrix}$ are supposed to be linearly dependent.

I went about doing this (proving the vectors are linearly dependent...) and this is what I got:

It is a known fact that the columns of an $n \times m$ matrix are linearly dependent if $m > n$... so we have to show WHY... By definition, if a system of vectors is linearly dependent, then at least one vector is a linear combination of the others. We know that $x_1v_1 + x_2v_2 + x_3v_3 + x_4v_4 + x_5v_5 = 0$; rearranging to make $v_3$ the subject gives $v_3 = \frac{-(x_1v_1 + x_2v_2 + x_4v_4 + x_5v_5)}{x_3}$, so long as $x_3$ doesn't equal zero... I then substituted the equations I found earlier that are part of the solution set... And then, because $v_3$ only needs to be a combination of 1 or more other vectors, I've selected $x_3 = -1$ and $x_4 = x_5 = 0$. And now the equation is in terms of $v_1$ and $v_2$. Is this correct?

P.S: Sorry I was away for a few days. I shall be visiting the forum less frequently since my second semester has started. Good Luck
It's more than understandable. Thank you for your continued help, and good luck with your second semester!

If there is anyone else out there who can help, please feel free.

8. Originally Posted by tsal15
My understanding of the definition of linear independence is that the sum of the scalars multiplied by their respective vectors equals zero, i.e. $x_1v_1 + x_2v_2 + ... + x_mv_m = 0$, which in other words means that the coefficients should be 0... Am I right?
Yes. Linear independence means that $x_1v_1 + x_2v_2 + ... + x_mv_m = 0$ forces $x_i = 0 \; \forall i$.

Originally Posted by tsal15
No, I spoke to my teacher (who really is more useful as a stick than as anything else); he hinted that the vectors were linearly dependent.

i.e. These vectors: $\begin{bmatrix} 1 \\ 2 \\ 0 \\ 1\end{bmatrix}$ $\begin{bmatrix} 2 \\ -1 \\ 5 \\ 7 \end{bmatrix}$ $\begin{bmatrix} 3 \\ 2 \\ 4 \\ 7 \end{bmatrix}$ $\begin{bmatrix} 4 \\ -1 \\ 9 \\ 13 \end{bmatrix}$ $\begin{bmatrix} 5 \\ 2 \\ 8 \\ 13\end{bmatrix}$ are supposed to be linearly dependent.
Yes, the above vectors are clearly linearly dependent... but I think you are losing track of the original question. The question was to find a basis and the dimension of the subspace of vectors x that satisfy Ax = 0 (for the matrix A we wrote before).

We told you that the subspace of vectors we need is popularly called the null space of A. Looking at the canonical (reduced) form, we can see that the rank of the matrix is 2. Thus the nullity is 3, and hence the dimension of the nullspace of A is 3. So the dimension of the subspace of vectors x that satisfy Ax = 0 is 3.

To find the basis, we reduced the matrix to canonical form and saw that the new system has only two equations in 5 variables. We (like HallsofIvy) chose values for $x_3$, $x_4$, $x_5$ three times so that we get three linearly independent vectors satisfying the new system. This set is maximal since the dimension of the null space is 3.
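The rank-nullity bookkeeping described above can be sketched with sympy (assuming it is available), using the matrix A quoted earlier in the thread:

```python
# Rank-nullity check: rank(A) = 2, so the nullity is 5 - 2 = 3,
# and nullspace() returns a basis with three vectors.
import sympy as sp

A = sp.Matrix([[1,  2, 3,  4,  5],
               [2, -1, 2, -1,  2],
               [0,  5, 4,  9,  8],
               [1,  7, 7, 13, 13]])

print(A.rank())        # 2
basis = A.nullspace()  # basis of the solution space of Ax = 0
print(len(basis))      # 3 = number of columns - rank
```

The three vectors `nullspace()` returns come from exactly the free-variable substitutions discussed in this thread (one free variable set to 1, the others to 0).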

Originally Posted by tsal15
I went about doing this (proving the vectors are linearly dependent...) and this is what I got:

It is a known fact that a 'n x m' matrix is linearly dependent if m > n ... so we have to show WHY ... by definition, if a system of vectors are linearly dependent, then there is at least one linear combination of one vector and another. We know that $x_1v_1 + x_2v_2 + x_3v_3 + x_4v_4 + x_5v_5 = 0$ rearrange to make v_3 the subject => $v_3 = \frac{-(x_1v_1 + x_2v_2 + x_4v_4 + x_5v_5)}{x_3} = 0$, so long as $x_3$ doesn't equal zero ... I then substitute the equations i found earlier that are part of the solution set...And then because $v_3$ only needs to be a combination of 1 or more other vector, I've selected that $x_3 = -1$ and $x_4 = x_5 = 0$. And now the equation is in terms of $v_1$ and $v_2$ . Is this correct?
You have to find those coefficients rather than choosing them.