# Orthogonal complement

• Dec 25th 2011, 03:02 PM
Ulysses
Orthogonal complement
I have this problem. I think I've solved the first part, but I'd like you to look it over, because I'm not sure I've proceeded correctly.

The problem says: Let $X=\mathbb{R}^2$. Find $M^{\perp{}}$ if:
a) $M=\{x\}$, where $x=(\xi_1,\xi_2)\neq 0$
b) A linearly independent set $\{x_1,x_2\}\subset{M}$

So, basically what I did in a) was stating: $M^{\perp}=\{z\in{X} \mid \langle z,y\rangle=0\ \forall{y\in{M}}\}$

Then $y=(\eta_1,\eta_2)$

Therefore $\langle x,y\rangle=0 \rightarrow \xi_1 \eta_1+\xi_2 \eta_2=0 \rightarrow \frac{\xi_1}{\xi_2}=\frac{-\eta_2}{\eta_1}$

And then $y=(\xi_2,-\xi_1)$.
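As a quick numerical sanity check of this step (a sketch with numpy; the sample value of $x$ is my own choice, not from the problem): for any nonzero $x=(\xi_1,\xi_2)$, the vector $(\xi_2,-\xi_1)$ has zero inner product with $x$.

```python
import numpy as np

# Sample nonzero vector x = (xi1, xi2); any nonzero choice works.
x = np.array([3.0, -2.0])

# Candidate orthogonal vector (xi2, -xi1) from the derivation above.
y = np.array([x[1], -x[0]])

# Inner product: xi1*xi2 + xi2*(-xi1) = 0.
print(np.dot(x, y))  # → 0.0
```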

Is this right in the first place?

And for b) I've stated $M=\{\alpha_1 x_1+\alpha_2 x_2 \mid \alpha_1,\alpha_2\in \mathbb{R} \}$, where $x_1$ and $x_2$ are the given linearly independent vectors in X.

I know there is no other vector in X linearly independent of these two, so I think there is nothing left over to be orthogonal to M, but I don't know how to proceed from here.

Bye, and thank you for your help, which is always useful.
• Dec 26th 2011, 12:27 AM
FernandoRevilla
Re: Orthogonal complement
Quote:

Originally Posted by Ulysses
Therefore $\langle x,y\rangle=0 \rightarrow \xi_1 \eta_1+\xi_2 \eta_2=0 \rightarrow \frac{\xi_1}{\xi_2}=\frac{-\eta_2}{\eta_1}$. And then $y=(\xi_2,-\xi_1)$.

We don't know if $\xi_2\neq 0$. Better: $M^{\perp}=\{(\eta_1,\eta_2)\in\mathbb{R}^2:\eta_1 \xi_1+\eta_2\xi_2=0\}$, then $\dim M^{\perp}=\dim \mathbb{R}^2-\textrm{rank}\;(\xi_1,\xi_2)=2-1=1$ and $(\xi_2,-\xi_1)$ is a nonzero vector of $M^{\perp}$. This means that $B=\{(\xi_2,-\xi_1)\}$ is a basis of $M^{\perp}$ and as a consequence $M^{\perp}=\textrm{Span}[(\xi_2,-\xi_1)]$.
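The rank argument above can also be checked numerically (a sketch with numpy; the value of $(\xi_1,\xi_2)$ is an arbitrary nonzero sample): $M^{\perp}$ is the null space of the $1\times 2$ matrix $(\xi_1\ \xi_2)$, which the SVD shows is one-dimensional.

```python
import numpy as np

# Arbitrary nonzero sample (xi1, xi2), written as a 1x2 matrix.
xi = np.array([[3.0, -2.0]])

# M-perp is the null space of this matrix; read it off the SVD:
# the rows of V^T beyond the rank span the null space.
_, s, vt = np.linalg.svd(xi)
null_basis = vt[len(s):]           # rank is 1 here, so one row remains

print(null_basis.shape[0])         # 1, matching dim M-perp = 2 - 1
print(abs(null_basis[0] @ xi[0]))  # ≈ 0: the basis vector is orthogonal to (xi1, xi2)
```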

Quote:

b) A linearly independent set $\{x_1,x_2\}\subset{M}$
$B=\{x_1,x_2\}$ is a basis of $\mathbb{R}^2$, so $M=\mathbb{R}^2$ and as a consequence $M^{\perp}=\{0\}$.
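The same conclusion can be verified numerically (a sketch with numpy; the pair $x_1, x_2$ is an arbitrary linearly independent sample of my own): a vector $z$ orthogonal to both must solve a homogeneous system whose matrix is invertible, so $z=0$ is the only solution.

```python
import numpy as np

# Arbitrary linearly independent pair x1, x2, stacked as the rows of A.
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])

# z in M-perp must satisfy A @ z = 0.  Since det(A) != 0, this
# homogeneous system has only the trivial solution z = 0.
print(np.linalg.det(A))              # ≈ -5, nonzero, so x1 and x2 are independent
z = np.linalg.solve(A, np.zeros(2))
print(z)                             # [0. 0.]
```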
• Dec 26th 2011, 12:28 AM
Opalg
Re: Orthogonal complement
Don't forget that an orthogonal complement is always a linear subspace. In (a), for example, $M^\perp$ cannot consist of a single vector. It will have to be the one-dimensional subspace consisting of all scalar multiples of that vector.

Your answer to (b) is essentially on the right lines, except that there is one vector, the zero vector, that lies in every linear subspace. So the answer here will be the (zero-dimensional) subspace $\{0\}$ consisting of that one vector.