# linear independence and the zero vector

• Jun 27th 2008, 07:13 PM
squarerootof2
linear independence and the zero vector
how would i prove that for a vector space V, a subset of V that contains the zero vector is linearly dependent?
• Jun 27th 2008, 07:51 PM
o_O
Isn't any set of vectors containing the zero vector linearly dependent, or am I misreading this?

Consider the nonempty set $\displaystyle S = \{\bold{v}_{1}, \bold{v}_{2}, \hdots, \bold{v}_{n}, \bold{0}\}$. From here, we can see that: $\displaystyle 0\bold{v}_{1} + 0\bold{v}_{2} + \hdots + 0\bold{v}_{n} + 1(\bold{0}) = \bold{0}$

i.e. $\displaystyle \bold{0}$ is a linear combination of the vectors in S whose coefficients aren't all 0, so S is linearly dependent.
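As a quick numerical sanity check of this argument (a minimal Python sketch; the vectors v1, v2 here are just made-up examples, any choice works):

```python
# Any finite set containing the zero vector is linearly dependent:
# putting coefficient 1 on the zero vector and 0 on everything else
# gives a nontrivial linear combination equal to the zero vector.
v1 = (1, 2, 3)
v2 = (4, 5, 6)
zero = (0, 0, 0)

def lin_comb(coeffs, vectors):
    """Return the componentwise linear combination sum(c * v)."""
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors))
                 for i in range(len(vectors[0])))

coeffs = (0, 0, 1)  # not all zero
print(lin_comb(coeffs, (v1, v2, zero)) == zero)  # prints True
```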
• Jun 27th 2008, 07:59 PM
squarerootof2
thanks, just noticed after posting.
• Jun 27th 2008, 08:41 PM
mr fantastic
Quote:

Originally Posted by o_O
Isn't any set of vectors containing the zero vector linearly dependent, or am I misreading this?

Consider the nonempty set $\displaystyle S = \{\bold{v}_{1}, \bold{v}_{2}, \hdots, \bold{v}_{n}, \bold{0}\}$. From here, we can see that: $\displaystyle 0\bold{v}_{1} + 0\bold{v}_{2} + \hdots + 0\bold{v}_{n} + 1(\bold{0}) = \bold{0}$

i.e. $\displaystyle \bold{0}$ is a linear combination of the vectors in S whose coefficients aren't all 0, so S is linearly dependent.

I have a problem with that definition of linear dependence and this is the ideal thread to get some opinions.

The definition of linear dependence I like to use is that any vector in S can be constructed as a linear combination of the others.

Consider the set S = {0, i, j}. Linearly dependent or not? I say not, since you can't construct i as a linear combination of 0 and j .....

(I do understand the argument when the other definition is used that S is dependent ..... I just don't like it ......)
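The claim that i is unreachable can be checked directly (a small Python sketch, taking i = (1, 0) and j = (0, 1) in R^2; the coefficient range scanned is an arbitrary sample):

```python
# mr fantastic's example: S = {0, i, j} in R^2.
# Any combination a*0 + b*j = (0, b) has first component 0,
# so it can never equal i = (1, 0).
zero = (0, 0)
i_vec = (1, 0)
j_vec = (0, 1)

def combo(a, b):
    return (a * zero[0] + b * j_vec[0], a * zero[1] + b * j_vec[1])

# Scan sample coefficients: no (a, b) reaches i.
hits = [(a, b) for a in range(-5, 6) for b in range(-5, 6)
        if combo(a, b) == i_vec]
print(hits)  # prints [] -- i is not a combination of 0 and j
```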
• Jun 27th 2008, 11:14 PM
squarerootof2
isn't that the definition of span? linearly independent means that a_1*v_1 + ... + a_n*v_n = 0 for scalars a_i and vectors v_i in the vector space forces all the scalars to equal 0. i don't see where the trouble is.
• Jun 27th 2008, 11:20 PM
mr fantastic
Quote:

Originally Posted by squarerootof2
isn't that the definition of span? linearly independent means that a_1*v_1 + ... + a_n*v_n = 0 for scalars a_i and vectors v_i in the vector space forces all the scalars to equal 0. i don't see where the trouble is.

I miswrote my question; it has since been edited.
• Jun 28th 2008, 02:14 PM
squarerootof2
from my understanding, ANY vector that contains the zero vector cannot be linearly independent (by earlier argument), thus linearly dependent. isn't that true?
• Jun 28th 2008, 02:37 PM
Plato
Quote:

Originally Posted by squarerootof2
from my understanding, ANY vector that contains the zero vector cannot be linearly independent (by earlier argument), thus linearly dependent. isn't that true?

ANY subset of vectors that contains the zero vector cannot be linearly independent.
That is the correct statement.
• Jun 28th 2008, 06:40 PM
o_O
Why does it have to be a subset that contains the zero vector? I thought any set of vectors that contained it was linearly dependent ... well, until mr fantastic's example came up with his definition of linear independence.
• Jun 28th 2008, 06:58 PM
mr fantastic
Quote:

Originally Posted by Plato
ANY subset of vectors that contains the zero vector cannot be linearly independent.
That is the correct statement.

I am familiar with this statement. But if a set S of vectors is linearly dependent then you should be able to express each element of S as a linear combination of the others ..... For S = {0, i, j}, how do you express i (or j) as a linear combination of the others?
• Jun 28th 2008, 09:47 PM
Isomorphism
Quote:

Originally Posted by mr fantastic
But if a set S of vectors is linearly dependent then you should be able to express each element of S as a linear combination of the others .....

Hmmm..... What about $\displaystyle S = \{(1,2),(2,4),(1,1)\}$?
This set is clearly linearly dependent. But I can't express each element of S as a linear combination of the others. For example, (1,1) cannot be expressed as a linear combination of the others, yet S is linearly dependent.
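Both halves of this example can be verified with a short Python sketch (the dependence relation 2(1,2) - 1(2,4) + 0(1,1) = (0,0) below is one nontrivial combination; nothing else is assumed):

```python
# Isomorphism's example: S = {(1,2), (2,4), (1,1)}.
u, v, w = (1, 2), (2, 4), (1, 1)

# Dependence: 2*(1,2) - 1*(2,4) + 0*(1,1) = (0,0), coefficients not all zero.
combo = tuple(2*u[i] - 1*v[i] + 0*w[i] for i in range(2))
print(combo == (0, 0))  # prints True: S is linearly dependent

# But (1,1) is NOT a combination of (1,2) and (2,4): both lie on the
# line y = 2x, so every combination a*(1,2) + b*(2,4) does too.
def on_line(p):
    return p[1] == 2 * p[0]

print(on_line(w))  # prints False: (1,1) is off the line y = 2x
```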

Considering your knowledge, perhaps you meant:
Quote:

Originally Posted by Mr.F meant
But if a set S of vectors is linearly dependent then you should be able to express some element of S as a linear combination of the others .....

Technically this statement is wrong too, unless you define it that way. A set S is defined to be linearly dependent if it is not linearly independent. So technically (formally, logically...) you have to negate the definition of linear independence to get the mathematical formulation of linear dependence. And that formulation is the one you don't like....

I think this dilemma of implication versus equivalence is pretty common. Generally, if you have a set of vectors S where one is able to express some element of S as a linear combination of the others, then the set is definitely linearly dependent. However, the other way round is not necessarily true. And your question works as a wonderful illustration of this fact.

These are my thoughts and they could be unconvincing. Perhaps some MHF algebraist can explain it better.
• Jun 29th 2008, 12:09 AM
mr fantastic
Quote:

Originally Posted by Isomorphism
Hmmm..... What about $\displaystyle S = \{(1,2),(2,4),(1,1)\}$?
This set is clearly linearly dependent. But I can't express each element of S as a linear combination of the others. For example, (1,1) cannot be expressed as a linear combination of the others, yet S is linearly dependent.

Considering your knowledge, perhaps you meant:

Technically this statement is wrong too, unless you define it that way. A set S is defined to be linearly dependent if it is not linearly independent. So technically (formally, logically...) you have to negate the definition of linear independence to get the mathematical formulation of linear dependence. And that formulation is the one you don't like....

I think this dilemma of implication versus equivalence is pretty common. Generally, if you have a set of vectors S where one is able to express some element of S as a linear combination of the others, then the set is definitely linearly dependent. However, the other way round is not necessarily true. And your question works as a wonderful illustration of this fact.

These are my thoughts and they could be unconvincing. Perhaps some MHF algebraist can explain it better.

Linear Dependence Theorem: The set $\displaystyle \{v_1, \, v_2, \, \dots, \, v_n\}$ of non-zero vectors is linearly dependent if and only if some $\displaystyle v_k$, $\displaystyle 2 \leq k \leq n$, is a linear combination of the preceding ones.
Theorem: A set of two or more vectors $\displaystyle S = \{v_1, \, v_2, \, \dots, \, v_n\}$ is linearly dependent if and only if one of the $\displaystyle v_i$ is a linear combination of the other vectors in S.
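The first theorem can be illustrated on Isomorphism's set with a small Python sketch (restricted to R^2 so that span membership reduces to a determinant/parallel test; the helper `in_span_2d` is an illustrative function, not from the thread):

```python
# Linear Dependence Theorem in R^2: scan the list and report the first
# vector v_k that is a linear combination of the preceding ones.
def in_span_2d(target, basis):
    """True if target is a linear combination of the 2D vectors in basis."""
    if not basis:
        return target == (0, 0)  # the empty combination only yields 0
    # Two independent basis vectors span all of R^2.
    for i in range(len(basis)):
        for j in range(i + 1, len(basis)):
            det = basis[i][0] * basis[j][1] - basis[i][1] * basis[j][0]
            if det != 0:
                return True
    # Otherwise all basis vectors lie on one line through the origin.
    b = next((v for v in basis if v != (0, 0)), None)
    if b is None:
        return target == (0, 0)
    return target[0] * b[1] - target[1] * b[0] == 0  # parallel test

S = [(1, 2), (2, 4), (1, 1)]
k = next(k for k in range(len(S)) if in_span_2d(S[k], S[:k]))
print(k, S[k])  # prints 1 (2, 4): the first vector dependent on its predecessors
```

Note that the theorem only guarantees *some* $\displaystyle v_k$ is a combination of its predecessors, consistent with the fact that (1,1) itself is not such a vector.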