# Thread: Prove that $v_j$ must be the zero vector

1. ## Prove that $v_j$ must be the zero vector

Let $S = \{v_1, v_2, \dots, v_k\}$ be an orthogonal set of vectors in $\mathbb{R}^n$. If S is linearly dependent, prove that one of the $v_j$ must be the zero vector.

Find an orthonormal basis for the column space of the matrix
$$A = \begin{bmatrix} 2 & 5 & 7 \\ 3 & 1 & 8 \\ 6 & 6 & 10 \\ 0 & 6 & -9 \end{bmatrix}$$
and obtain the QR factorisation of A.

2. Let $n=3$, and let $v_{1}=(1,0,0),\,\, v_{2}=(0,1,0)$. Then $\{v_{1},v_{2}\}\subset \mathbb{R}^{n}$ is an orthogonal set and neither vector is zero. Did you mean to assume that $k>n$?

3. That set is not linearly dependent.

wopashui, if the set is linearly dependent, then there exist numbers, $a_i$, not all 0, such that $a_1v_1+ a_2v_2+ \cdots + a_kv_k= 0$. Now take the dot product of both sides of that with $v_1$, $v_2$, etc.
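Carrying that hint one step further (a sketch, using the same notation as above):

```latex
% Dot both sides of a_1 v_1 + \cdots + a_k v_k = 0 with v_j.
% Orthogonality kills every cross term (v_i \cdot v_j = 0 for i \neq j):
\[
0 = \left(\sum_{i=1}^{k} a_i v_i\right)\cdot v_j
  = a_j\,(v_j \cdot v_j)
  = a_j\,\|v_j\|^2 .
\]
% Since not all the a_i are 0, pick a j with a_j \neq 0; then
% \|v_j\|^2 = 0, which forces v_j = 0.
```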

As for the second problem, think of the three columns of A as three vectors and use "Gram-Schmidt" to find an orthonormal basis.
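A quick numerical sketch of that suggestion (the function name `gram_schmidt_qr` is mine, not from the thread): run classical Gram-Schmidt on the columns of A, recording the projection coefficients, which gives both the orthonormal basis Q and the upper-triangular factor R at once.

```python
import numpy as np

# The matrix A from the problem.
A = np.array([[2.0, 5.0, 7.0],
              [3.0, 1.0, 8.0],
              [6.0, 6.0, 10.0],
              [0.0, 6.0, -9.0]])

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt on the columns of A.

    Returns Q (orthonormal columns spanning Col(A)) and upper-triangular R
    with A = Q @ R. Assumes the columns of A are linearly independent.
    """
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            # Coefficient of the projection onto the i-th orthonormal vector.
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)  # length of what remains
        Q[:, j] = v / R[j, j]        # normalise
    return Q, R

Q, R = gram_schmidt_qr(A)
# For this particular A every intermediate vector has length 7, so
# R = [[7, 7, 14], [0, 7, -7], [0, 0, 7]].
print(np.round(R, 10))
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: columns of Q are orthonormal
print(np.allclose(Q @ R, A))            # True: A = QR
```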

4. Woops! Note to self: read the question!

5. Originally Posted by nimon
Woops! Note to self: read the question!
Always a good suggestion!

6. Originally Posted by HallsofIvy
That set is not linearly dependent.

wopashui, if the set is linearly dependent, then there exist numbers, $a_i$, not all 0, such that $a_1v_1+ a_2v_2+ \cdots + a_kv_k= 0$. Now take the dot product of both sides of that with $v_1$, $v_2$, etc.

As for the second problem, think of the three columns of A as three vectors and use "Gram-Schmidt" to find an orthonormal basis.

but none of the vectors of A are orthonormal; which vector do I start with? And what is the QR factorisation of A?

7. It doesn't matter that the columns aren't orthonormal to begin with. The Gram-Schmidt procedure takes any set of linearly independent vectors and turns them into an orthogonal basis of the same span, which can then be normalised to give an orthonormal basis.
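As a sanity check on the second problem, NumPy's built-in QR (which uses Householder reflections rather than Gram-Schmidt, but yields an equally valid factorisation, possibly with some column signs flipped) can be compared against a hand computation:

```python
import numpy as np

A = np.array([[2.0, 5.0, 7.0],
              [3.0, 1.0, 8.0],
              [6.0, 6.0, 10.0],
              [0.0, 6.0, -9.0]])

# Reduced QR: Q is 4x3 with orthonormal columns, R is 3x3 upper triangular.
Q, R = np.linalg.qr(A)

# The columns of Q are an orthonormal basis for the column space of A,
# and A = QR is the QR factorisation asked for in the problem.
print(np.allclose(Q.T @ Q, np.eye(3)))  # True
print(np.allclose(Q @ R, A))            # True
```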