# Span of a subset

• Apr 4th 2013, 04:08 PM
widenerl194
Span of a subset
I just need to know how to start this... I'm looking through my notes but can't figure out how to begin. The wording is also a little bit confusing.

Find a finite subset of each of the following subspaces for which the subspace is the span of the subset.

$\{\bar{x}\in\mathbb{R}^5 : A\bar{x}=\bar{\theta}\}$, where $\bar{\theta}$ is the zero vector in $\mathbb{R}^3$ and

$A=\begin{bmatrix} 1 & 1 & 2 & 1 & -1 \\ 2 & 2 & 1 & 2 & -1 \\ 3 & 3 & 2 & 3 & -1 \end{bmatrix}$

(Note $\bar{x}$ must lie in $\mathbb{R}^5$, since $A$ is $3\times 5$.)
• Apr 4th 2013, 05:09 PM
Gusbob
Re: Span of a subset
In this problem, the subspace in question is the nullspace of Ax=0. It is an elementary linear algebra problem to find the parametric solution for the system Ax=0. The vectors you get there will span the nullspace.
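The recipe above (row reduce, read off the parametric solution, collect the spanning vectors) can be sketched with SymPy; this is my choice of tool, not something used in the thread, and `Matrix.nullspace` bundles all three steps into one call:

```python
from sympy import Matrix

# The coefficient matrix from the problem.
A = Matrix([
    [1, 1, 2, 1, -1],
    [2, 2, 1, 2, -1],
    [3, 3, 2, 3, -1],
])

# nullspace() returns a basis for {x : A x = 0}; the subspace in the
# problem is exactly the span of these (finitely many) vectors.
basis = A.nullspace()

for v in basis:
    # Sanity check: each basis vector really solves A x = 0.
    assert A * v == Matrix([0, 0, 0])

print(len(basis))  # number of spanning vectors = number of free variables
```

Here there are two free variables, so the call returns two vectors, matching the hand computation later in the thread.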
• Apr 4th 2013, 06:21 PM
widenerl194
Re: Span of a subset
Okay, what I did was the following:

Let $\bar{x}=\begin{bmatrix} a \\ b \\ c \\ d \\ e \end{bmatrix}$ be an element of $U$. Then

$\begin{bmatrix} 1 & 1 & 2 & 1 & -1 \\ 2 & 2 & 1 & 2 & -1 \\ 3 & 3 & 2 & 3 & -1 \end{bmatrix}\begin{bmatrix} a \\ b \\ c \\ d \\ e \end{bmatrix}=\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$

Then I put that in an augmented matrix and row reduced to get

$\begin{bmatrix} 1 & 1 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{bmatrix}$
I got that $e=0$ and $c=0$, but now I don't know where to go from here, or even whether I'm doing this the right way.
• Apr 4th 2013, 06:46 PM
Gusbob
Re: Span of a subset
Quote:

Then I put that in an augmented matrix and row reduced to get

$\begin{bmatrix} 1 & 1 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{bmatrix}$
This translates to the equations $a+b+d=0$ and $c=e=0$ as you have noted. Then

$\bar{x}=\begin{bmatrix} a \\ b \\c \\ d \\ e \end{bmatrix}=\begin{bmatrix} -b-d \\ b \\0 \\ d \\0 \end{bmatrix}=b\begin{bmatrix} -1 \\ 1 \\0 \\ 0 \\0 \end{bmatrix}+d\begin{bmatrix} -1 \\ 0 \\0 \\ 1 \\0 \end{bmatrix}$

with $b,d$ free. Since every $\bar{x}$ is of this form, it follows that the two vectors I wrote down above span the nullspace of A.
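For anyone who wants to double-check the algebra above numerically, here is a quick verification (using NumPy, which is my own addition, not part of the thread) that $A$ sends both vectors, and therefore every combination $b\bar{v}_1+d\bar{v}_2$, to the zero vector:

```python
import numpy as np

A = np.array([
    [1, 1, 2, 1, -1],
    [2, 2, 1, 2, -1],
    [3, 3, 2, 3, -1],
])

# The two spanning vectors read off from the free variables b and d.
v1 = np.array([-1, 1, 0, 0, 0])
v2 = np.array([-1, 0, 0, 1, 0])

# A kills each vector individually...
assert np.all(A @ v1 == 0)
assert np.all(A @ v2 == 0)

# ...and hence, by linearity, any combination b*v1 + d*v2.
b, d = 3, -7  # arbitrary values of the free variables
assert np.all(A @ (b * v1 + d * v2) == 0)
```

The last assertion is just linearity of matrix multiplication: $A(b\bar{v}_1+d\bar{v}_2)=bA\bar{v}_1+dA\bar{v}_2=\bar{0}$.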
• Apr 4th 2013, 08:12 PM
widenerl194
Re: Span of a subset
Okay, this makes so much sense now. But where did you get those two vectors from?
• Apr 4th 2013, 08:15 PM
widenerl194
Re: Span of a subset
Never mind, I figured it out! Thanks so much!!
• Apr 4th 2013, 08:16 PM
Gusbob
Re: Span of a subset
$\begin{bmatrix} -b-d \\ b \\0 \\ d \\0 \end{bmatrix}=\begin{bmatrix} -b \\ b \\0 \\ 0 \\0 \end{bmatrix}+\begin{bmatrix} -d \\ 0 \\0 \\ d \\0 \end{bmatrix}=b\begin{bmatrix} -1 \\ 1 \\0 \\ 0 \\0 \end{bmatrix}+d\begin{bmatrix} -1 \\ 0 \\0 \\ 1 \\0 \end{bmatrix}$.

Does this make more sense?