1. Orthogonalization proof

Hello,
the task is the following. Suppose that $\displaystyle S=\{y_1,y_2,y_3\}$ is an orthogonal set of vectors in a subspace $\displaystyle V$ and that $\displaystyle \vec{0} \notin S$. Also let $\displaystyle x \in V$. Define:
$\displaystyle y=x-\sum_{i=1}^{3}\frac{(x, y_i)}{(y_i, y_i)}y_i$

So, the problem is to show that $\displaystyle y$ is orthogonal to all vectors of $\displaystyle S$.

Expanding the sum, we get:
$\displaystyle y=x-\frac{(x, y_1)}{(y_1, y_1)}y_1-\frac{(x, y_2)}{(y_2, y_2)}y_2-\frac{(x, y_3)}{(y_3, y_3)}y_3$

This looks like the formula from the Gram-Schmidt process, so in my opinion it must hold that $\displaystyle y \perp y_i$ for every $\displaystyle y_i \in S$, because the G-S process orthogonalizes vectors. But I think I need more than that to prove the proposition, and I don't know how to continue. Any help is welcome. Thanks!

2. If $\displaystyle V$ is a subspace, you absolutely must have $\displaystyle \vec{0}\in V.$ Are you sure you meant $\displaystyle \vec{0}\not\in V?$

Also, did you mean that $\displaystyle S$ is an orthogonal basis for $\displaystyle V?$

3. Originally Posted by Ackbeet
If $\displaystyle V$ is a subspace, you absolutely must have $\displaystyle \vec{0}\in V.$ Are you sure you meant $\displaystyle \vec{0}\not\in V?$
Oops... I meant $\displaystyle \vec{0}\not\in S$. Does that make sense?

Originally Posted by Ackbeet
Also, did you mean that $\displaystyle S$ is an orthogonal basis for $\displaystyle V?$
Yeah, that's what I'm trying to say.

4. Ok, so $\displaystyle \vec{0}\not\in S,$ but you have to be able to get $\displaystyle \vec{0}$ from a linear combination of vectors in $\displaystyle S,$ right?

Moving on to the proof, why not compute

$\displaystyle (y_{k},y)?$

I think you'll only need to do one - the other two are analogous. What should you get as the result of this computation?

5. Originally Posted by Ackbeet
$\displaystyle (y_{k},y)?$

I think you'll only need to do one - the other two are analogous. What should you get as the result of this computation?
I think I should get $\displaystyle \vec{0}$, because they must be orthogonal. I know how the dot product is defined, but in this case I'm puzzled, because I don't know how many components there are in $\displaystyle y_{k}$, etc.

6. Well, you'll get the scalar 0, since the result of any dot product is a scalar, right? But yes, you should get zero if they are orthogonal. You can't compute the dot products directly, but you can use various theorems you know about dot products, such as:

$\displaystyle (x,\alpha y)=\alpha(x,y),$ for all scalars $\displaystyle \alpha$ and vectors $\displaystyle x$ and $\displaystyle y.$

$\displaystyle (x,y+z)=(x,y)+(x,z)$ for all vectors $\displaystyle x, y, z.$

You also know that $\displaystyle S$ is an orthogonal set (all the vectors in it are mutually orthogonal). That tells you something about, say, $\displaystyle (y_{1},y_{2}),$ right?

Just dive in and see what happens.
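(As a numerical sanity check, not a substitute for the proof, one can also verify the identity with a short Python script. The concrete vectors and the helper names `dot`, `gram_schmidt`, etc. below are arbitrary choices for illustration, not part of the thread.)

```python
# Sanity check: build an orthogonal set {y1, y2, y3} in R^4 via
# Gram-Schmidt, form y = x - sum_i (x, y_i)/(y_i, y_i) * y_i as in
# post 1, and confirm (y_k, y) is (numerically) zero for each k.

def dot(a, b):
    """Standard dot product on R^n."""
    return sum(ai * bi for ai, bi in zip(a, b))

def sub(a, b):
    return [ai - bi for ai, bi in zip(a, b)]

def scale(c, a):
    return [c * ai for ai in a]

def gram_schmidt(vectors):
    """Orthogonalize a list of linearly independent vectors
    (projections subtracted, no normalization)."""
    basis = []
    for v in vectors:
        w = v
        for b in basis:
            w = sub(w, scale(dot(w, b) / dot(b, b), b))
        basis.append(w)
    return basis

# Three linearly independent vectors in R^4 (arbitrary choice).
y1, y2, y3 = gram_schmidt([[1.0, 2.0, 0.0, 1.0],
                           [0.0, 1.0, 1.0, 3.0],
                           [2.0, 0.0, 1.0, 0.0]])

x = [3.0, 1.0, 4.0, 1.0]  # an arbitrary vector

# y = x - sum_i (x, y_i)/(y_i, y_i) * y_i
y = x
for yi in (y1, y2, y3):
    y = sub(y, scale(dot(x, yi) / dot(yi, yi), yi))

for k, yk in enumerate((y1, y2, y3), start=1):
    print(f"(y{k}, y) = {dot(yk, y):.2e}")  # each ≈ 0 up to rounding
```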

7. Got it now, using your help and the proof of the Gram-Schmidt process. Thanks very much!
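For reference, here is how the computation works out, using the two inner-product rules from the previous post together with symmetry $\displaystyle (y_k,x)=(x,y_k)$ (for a real inner product) and the orthogonality $\displaystyle (y_k,y_i)=0$ for $\displaystyle i\neq k$:

$\displaystyle (y_k,y)=\Big(y_k,\;x-\sum_{i=1}^{3}\frac{(x,y_i)}{(y_i,y_i)}y_i\Big)=(y_k,x)-\sum_{i=1}^{3}\frac{(x,y_i)}{(y_i,y_i)}(y_k,y_i)=(x,y_k)-\frac{(x,y_k)}{(y_k,y_k)}(y_k,y_k)=0.$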

8. You're welcome. Have a good one!