Finding a basis for an orthogonal complement

• May 18th 2008, 06:01 PM
pakman
Finding a basis for an orthogonal complement
Let $S$ be the subspace of $R^{4}$ spanned by $x_1=(1, 0, -2, 1)^{T}$ and $x_2=(0, 1, 3, -2)^{T}$. Find a basis for $S^{\bot}$.

I was thinking about multiplying the vectors by a vector $[x_1,x_2,x_3,x_4]^T \in S^{\bot}$, but since there are two vectors I am lost on where to begin. Help would be appreciated!
• May 18th 2008, 07:11 PM
mr fantastic
Quote:

Originally Posted by pakman
Let $S$ be the subspace of $R^{4}$ spanned by $x_1=(1, 0, -2, 1)^{T}$ and $x_2=(0, 1, 3, -2)^{T}$. Find a basis for $S^{\bot}$.

I was thinking about multiplying the vectors by a vector $[x_1,x_2,x_3,x_4]^T \in S^{\bot}$, but since there are two vectors I am lost on where to begin. Help would be appreciated!

Can you explain what the notation $S^{\bot}$ means?
• May 18th 2008, 07:34 PM
pakman
Quote:

Originally Posted by mr fantastic
Can you explain what the notation $S^{\bot}$ means?

$S^{\bot}$ is the orthogonal complement of $S$.

From my book...

$Y^{\bot}=\{x \in \mathbb{R}^{n} \mid x^{T}y=0 \text{ for every } y \in Y\}$
• May 18th 2008, 08:12 PM
NonCommAlg
Quote:

Originally Posted by pakman
Let $S$ be the subspace of $R^{4}$ spanned by $x_1=(1, 0, -2, 1)^{T}$ and $x_2=(0, 1, 3, -2)^{T}$. Find a basis for $S^{\bot}$.

I was thinking about multiplying the vectors by a vector $[x_1,x_2,x_3,x_4]^T \in S^{\bot}$, but since there are two vectors I am lost on where to begin. Help would be appreciated!

the general method is to put your vectors as the rows of a matrix: $A=\begin{pmatrix}1 & 0 & -2 & 1 \\ 0 & 1 & 3 & -2 \end{pmatrix}\ .$ then a basis for the solution set of

$A \bold{x}=0$ is a basis for $S^{\bot}.$ let $\bold{x}=(x_1 \ x_2 \ x_3 \ x_4)^T,$ and $A \bold{x}=0.$ then $x_1-2x_3+x_4=x_2+3x_3-2x_4=0.$ therefore:

$x_3=2x_1+x_2, \ x_4=3x_1+2x_2.$ hence $\bold{x}=x_1(1 \ \ 0 \ \ 2 \ \ 3)^T + x_2(0 \ \ 1 \ \ 1 \ \ 2)^T$ (here $x_1, x_2$ are the free scalar components of $\bold{x},$ not the spanning vectors). so a basis for $S^{\bot}$ is:

$\{(1 \ \ 0 \ \ 2 \ \ 3)^T, (0 \ \ 1 \ \ 1 \ \ 2)^T \}. \ \ \ \ \square$
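The null-space computation above can also be sketched with SymPy's `nullspace` method (a verification aid, not part of the original post; SymPy parametrizes by the free variables $x_3, x_4$, so the basis it returns may differ from the one above while spanning the same space):

```python
import sympy as sp

# rows of A are the spanning vectors x1 and x2 of S
A = sp.Matrix([[1, 0, -2, 1],
               [0, 1, 3, -2]])

# nullspace() returns a basis for {x : Ax = 0}, which is S-perp
basis = A.nullspace()

for v in basis:
    print(v.T)  # each v satisfies A*v = 0
```

Any basis of this 2-dimensional null space is a valid answer; the one in the post above and the one SymPy returns are related by an invertible change of basis.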
• May 18th 2008, 08:42 PM
pakman
Quote:

Originally Posted by NonCommAlg
the general method is to put your vectors as the rows of a matrix: $A=\begin{pmatrix}1 & 0 & -2 & 1 \\ 0 & 1 & 3 & -2 \end{pmatrix}\ .$ then a basis for the solution set of

$A \bold{x}=0$ is a basis for $S^{\bot}.$ let $\bold{x}=(x_1 \ x_2 \ x_3 \ x_4)^T,$ and $A \bold{x}=0.$ then $x_1-2x_3+x_4=x_2+3x_3-2x_4=0.$ therefore:

$x_3=2x_1+x_2, \ x_4=3x_1+2x_2.$ hence $\bold{x}=x_1(1 \ \ 0 \ \ 2 \ \ 3)^T + x_2(0 \ \ 1 \ \ 1 \ \ 2)^T$ (here $x_1, x_2$ are the free scalar components of $\bold{x},$ not the spanning vectors). so a basis for $S^{\bot}$ is:

$\{(1 \ \ 0 \ \ 2 \ \ 3)^T, (0 \ \ 1 \ \ 1 \ \ 2)^T \}. \ \ \ \ \square$

Did you take the transpose of the $x_1$ and $x_2$ to get that $A$?
• May 18th 2008, 10:46 PM
NonCommAlg
Quote:

Originally Posted by pakman
Did you take the transpose of the $x_1$ and $x_2$ to get that $A$?

yes! your vectors make the rows of $A.$
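As a quick numerical sanity check (a NumPy sketch, not part of the original thread), one can confirm that each basis vector found above is orthogonal to both spanning vectors:

```python
import numpy as np

# spanning vectors of S (the rows of A)
x1 = np.array([1, 0, -2, 1])
x2 = np.array([0, 1, 3, -2])

# the basis for S-perp found in the solution above
b1 = np.array([1, 0, 2, 3])
b2 = np.array([0, 1, 1, 2])

# every dot product of a spanning vector with a basis vector is 0
for s in (x1, x2):
    for b in (b1, b2):
        print(s @ b)  # prints 0 each time
```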