# Prove that vj must be the zero vector

• Apr 6th 2010, 11:55 AM
wopashui
Prove that vj must be the zero vector
Let S = {$\displaystyle v_1, v_2,....v_k$} be an orthogonal set of vectors in $\displaystyle R^n$. If S is linearly dependent, prove that one of the $\displaystyle v_j$ must be the zero vector.

Find an orthonormal basis for the column space of the matrix $\displaystyle A = \begin{bmatrix} 2 & 5 & 7 \\ 3 & 1 & 8 \\ 6 & 6 & 10 \\ 0 & 6 & -9 \end{bmatrix}$ and obtain the QR factorisation of A.
• Apr 7th 2010, 01:42 AM
nimon
Let $\displaystyle n=3$, and let $\displaystyle v_{1}=(1,0,0),\,\, v_{2}=(0,1,0)$. Then $\displaystyle \{v_{1},v_{2}\}\subset \mathbb{R}^{n}$ are orthogonal and neither are zero. Did you mean to assume that $\displaystyle k>n$?
• Apr 7th 2010, 03:42 AM
HallsofIvy
That set is not linearly dependent.

wopashui, if the set is linearly dependent, then there exist numbers, $\displaystyle a_i$, not all 0, such that $\displaystyle a_1v_1+ a_2v_2+ \cdots+ a_kv_k= 0$. Now take the dot product of both sides of that with $\displaystyle v_1$, $\displaystyle v_2$, etc.
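Spelling that hint out (an added derivation, not part of the original post): pick a $\displaystyle j$ with $\displaystyle a_j \neq 0$ and dot both sides of the relation with $\displaystyle v_j$. Orthogonality kills every cross term:

```latex
0 = \left( a_1 v_1 + a_2 v_2 + \cdots + a_k v_k \right) \cdot v_j
  = \sum_{i=1}^{k} a_i \, (v_i \cdot v_j)
  = a_j \, (v_j \cdot v_j)
  = a_j \, \|v_j\|^2 ,
```

and since $\displaystyle a_j \neq 0$ this forces $\displaystyle \|v_j\|^2 = 0$, i.e. $\displaystyle v_j$ is the zero vector.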

As for the second problem, think of the three columns of A as three vectors and use "Gram-Schmidt" to find an orthonormal basis.
• Apr 7th 2010, 07:08 AM
nimon
Woops! Note to self: read the question!
• Apr 7th 2010, 12:57 PM
HallsofIvy
Quote:

Originally Posted by nimon
Woops! Note to self: read the question!

Always a good suggestion!
• Apr 13th 2010, 11:15 AM
wopashui
Quote:

Originally Posted by HallsofIvy
That set is not linearly dependent.

wopashui, if the set is linearly dependent, then there exist numbers, $\displaystyle a_i$, not all 0, such that $\displaystyle a_1v_1+ a_2v_2+ \cdots+ a_kv_k= 0$. Now take the dot product of both sides of that with $\displaystyle v_1$, $\displaystyle v_2$, etc.

As for the second problem, think of the three columns of A as three vectors and use "Gram-Schmidt" to find an orthonormal basis.

But none of the columns of A are orthonormal, so which vector do I start with? And what is the QR factorisation of A?
• Apr 13th 2010, 11:48 AM
nimon
It doesn't matter that the columns aren't orthogonal. The Gram-Schmidt procedure takes any set of basis vectors and turns them into an orthogonal basis of the same subspace (the span of the original vectors), which can then be normalised to give an orthonormal basis. You can start with any column; the usual convention is to take them in order, starting with the first.
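To make this concrete, here is a rough pure-Python sketch of classical Gram-Schmidt applied to the columns of the A in the question; the coefficients $\displaystyle r_{ij}$ collected along the way form R, and the normalised vectors form Q, giving A = QR. (The code and the name `gram_schmidt_qr` are illustrative, not from the thread.)

```python
# Classical Gram-Schmidt on the columns of A, reading off Q and R so that A = QR.
# Columns are plain Python lists; no external libraries assumed.

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def norm(v):
    return dot(v, v) ** 0.5

def gram_schmidt_qr(cols):
    """Return (Q_cols, R): Q_cols are orthonormal, R is upper triangular."""
    k = len(cols)
    q = []                                  # orthonormal columns built so far
    R = [[0.0] * k for _ in range(k)]
    for j, c in enumerate(cols):
        u = list(c)
        for i in range(j):
            R[i][j] = dot(q[i], c)          # projection coefficient onto q_i
            u = [x - R[i][j] * y for x, y in zip(u, q[i])]
        R[j][j] = norm(u)                   # length of what remains
        q.append([x / R[j][j] for x in u])  # normalise
    return q, R

# Columns of the matrix A from the question.
cols = [[2, 3, 6, 0], [5, 1, 6, 6], [7, 8, 10, -9]]
Q, R = gram_schmidt_qr(cols)
# For this A everything comes out in whole numbers:
# R = [[7, 7, 14], [0, 7, -7], [0, 0, 7]], and the columns of Q are
# (2,3,6,0)/7, (3,-2,0,6)/7, (6,0,-2,-3)/7.
```

For this particular matrix the arithmetic is unusually clean: each orthogonalised column has length exactly 7, which is a strong hint the problem was designed for hand computation.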