# Span and linear independence question

• Feb 24th 2011, 04:23 PM
kensington
Span and linear independence question
Suppose that {u, v} is a linearly independent set of vectors and w is not in the span of {u, v}. Show that the set {u, v, w} is linearly independent.

So we know that for a linearly independent set, $\displaystyle x_1v_1+x_2v_2+\cdots+x_pv_p=0$ has only the trivial solution.
So does that mean I have to plug u and w into that equation and show it equals 0?
I'm confused about how to show it when w is not in the span. Any help would be appreciated.
• Feb 24th 2011, 06:13 PM
tonio
Quote:

Originally Posted by kensington
Suppose that {u, v} is a linearly independent set of vectors and w is not in the span of {u, v}. Show that the set {u, v, w} is linearly independent.

So we know that for a linearly independent set, $\displaystyle x_1v_1+x_2v_2+\cdots+x_pv_p=0$ has only the trivial solution.
So does that mean I have to plug u and w into that equation and show it equals 0?
I'm confused about how to show it when w is not in the span. Any help would be appreciated.

Suppose $\displaystyle au+bv+cw=0\,,\,\,a,b,c\mbox{ scalars }\Longrightarrow cw = -au-bv$ . If $\displaystyle c\neq 0$ then

dividing by $\displaystyle c$ we'd get that $\displaystyle w$ is a linear combination of $\displaystyle u,v$, contradicting the assumption, so it must be that $\displaystyle c=0\Longrightarrow au+bv=0\Longrightarrow$ since $\displaystyle \{u,v\}$ is independent, also $\displaystyle a=b=0\Longrightarrow$ we're done.

Tonio
• Feb 25th 2011, 07:02 AM
Raoh
In a different way:
Let $\displaystyle a,b,c$ be scalars such that $\displaystyle au+bv+cw=0$.
If we had $\displaystyle c\neq 0$ we would have $\displaystyle w=-\frac{1}{c}\left ( au+bv \right )$, which implies $\displaystyle w\in \texttt{Span}(v,u)$, contradicting the assumption. Therefore $\displaystyle c=0$. As a result, we are back to $\displaystyle au+bv=0$, and since $\displaystyle \{u,v\}$ is linearly independent, this forces $\displaystyle a=b=0$. And we're done.
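As a numerical sanity check of the argument above, here is a small sketch in Python with NumPy (the particular vectors u, v, w below are an illustrative choice, not from the thread): if w lies outside Span{u, v}, then stacking w onto the independent pair raises the rank of the matrix from 2 to 3, which is exactly the statement that {u, v, w} is linearly independent.

```python
import numpy as np

# Illustrative vectors in R^3 (hypothetical choice):
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
# w has a nonzero third component, so it is not in Span{u, v}
w = np.array([1.0, 2.0, 3.0])

# {u, v} is linearly independent: the matrix with rows u, v has rank 2
rank_uv = np.linalg.matrix_rank(np.vstack([u, v]))
print(rank_uv)  # 2

# Appending w raises the rank to 3, i.e. {u, v, w} is linearly independent
rank_uvw = np.linalg.matrix_rank(np.vstack([u, v, w]))
print(rank_uvw)  # 3
```

If instead w were a combination of u and v (say w = 2u + 5v), the second rank would stay at 2, matching the $c \neq 0$ case in the proofs.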