# Thread: Orthogonalization proof

1. ## Orthogonalization proof

Hello,
the task is the following. Suppose that $S=\{y_1,y_2,y_3\}$ is an orthogonal set of vectors in a subspace $V$, with $\vec{0} \notin S$, and let $x \in V$. Define:
$y=x-\sum_{i=1}^{3}\frac{(x, y_i)}{(y_i, y_i)}y_i$

So, the problem is to show that $y$ is orthogonal to every vector in $S$.

Expanding the sum, we get:
$y=x-\frac{(x, y_1)}{(y_1, y_1)}y_1-\frac{(x, y_2)}{(y_2, y_2)}y_2-\frac{(x, y_3)}{(y_3, y_3)}y_3$

This looks like the formula from the Gram-Schmidt process, so in my opinion it must be that $y \perp y_i$ for every $y_i \in S$, because the Gram-Schmidt process orthogonalizes vectors. But I think I need more than that to prove the proposition, and I don't know how to continue. Any help is welcome. Thanks!

2. If $V$ is a subspace, you absolutely must have $\vec{0}\in V.$ Are you sure you meant $\vec{0}\not\in V?$

Also, did you mean that $S$ is an orthogonal basis for $V?$

3. Originally Posted by Ackbeet
If $V$ is a subspace, you absolutely must have $\vec{0}\in V.$ Are you sure you meant $\vec{0}\not\in V?$
Oops... I meant $\vec{0}\not\in S$. Does that make sense?

Originally Posted by Ackbeet
Also, did you mean that $S$ is an orthogonal basis for $V?$
Yeah, that's what I'm trying to say.

4. Ok, so $\vec{0}\not\in S,$ but you have to be able to get $\vec{0}$ from a linear combination of vectors in $S,$ right?

Moving on to the proof, why not compute

$(y_{k},y)?$

I think you'll only need to do one - the other two are analogous. What should you get as the result of this computation?

5. Originally Posted by Ackbeet
$(y_{k},y)?$

I think you'll only need to do one - the other two are analogous. What should you get as the result of this computation?
I think I should get $\vec{0}$, because they must be orthogonal. I know how the dot product is defined, but in this case I'm puzzled, because I don't know how many components $y_{k}$ and the other vectors have.

6. Well, you'll get the scalar 0, since the result of any dot product is a scalar, right? But yes, you should get zero if they are orthogonal. You can't compute the dot products directly, but you can use various theorems you know about dot products, such as:

$(x,\alpha y)=\alpha(x,y),$ for all scalars $\alpha$ and vectors $x$ and $y.$

$(x,y+z)=(x,y)+(x,z)$ for all vectors $x, y, z.$

You also know that $S$ is an orthogonal set (all the vectors in it are mutually orthogonal). That tells you something about, say, $(y_{1},y_{2}),$ right?

Just dive in and see what happens.
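If it helps to see it numerically first, here is a quick sanity check. The vectors and the inner product are my own choices (a made-up orthogonal set in $\mathbb{R}^3$ with the standard dot product), not anything from your problem, but the pattern is the same:

```python
import numpy as np

# A hypothetical orthogonal set S = {y1, y2, y3} in R^3 (example values only).
y1 = np.array([2.0, 0.0, 0.0])
y2 = np.array([0.0, 3.0, 0.0])
y3 = np.array([0.0, 0.0, 1.5])

# An arbitrary x in V (here V = R^3, since S spans R^3).
x = np.array([1.0, -4.0, 2.5])

# y = x minus the sum of the projections of x onto each y_i.
y = x - sum(np.dot(x, yi) / np.dot(yi, yi) * yi for yi in (y1, y2, y3))

# Each dot product (y_k, y) should come out (numerically) zero.
for yi in (y1, y2, y3):
    print(np.dot(yi, y))
```

Note that since $S$ here is a basis for $V$ and $x \in V$, $y$ itself turns out to be the zero vector; the dot products are zero either way, which is all the proposition claims.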

7. Got it now with using your help and proof of Gram-Schmidt process. Thanks very much!
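For the record, here is how the computation went for $y_1$ (the other two indices are analogous; I'm also using symmetry of the inner product, which holds for the real dot product). By linearity in the second argument:

$(y_1, y) = (y_1, x) - \sum_{i=1}^{3}\frac{(x, y_i)}{(y_i, y_i)}(y_1, y_i)$

Since $S$ is orthogonal, $(y_1, y_i) = 0$ for $i \neq 2, 3$... I mean for $i = 2, 3$, so only the $i = 1$ term survives:

$(y_1, y) = (y_1, x) - \frac{(x, y_1)}{(y_1, y_1)}(y_1, y_1) = (y_1, x) - (x, y_1) = 0$

And $\vec{0} \notin S$ is exactly what guarantees $(y_i, y_i) \neq 0$, so the divisions are legitimate.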

8. You're welcome. Have a good one!