# Math Help - Relationship between linear dependence and parallel vectors

1. ## Relationship between linear dependence and parallel vectors

I just need some verification. Does linear dependence imply that the vectors are parallel?

And vice versa: if the vectors are parallel, does that imply that they are linearly dependent?

2. It's best not to think "parallel", since all vectors are considered to have their base at the origin. If you have a set of two nonzero vectors {a,b}, then the following are equivalent:

1. a and b are linearly dependent,
2. a=cb for some scalar c.
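To see one direction of this equivalence concretely, here is a small sketch (the vectors a = (2,4) and b = (1,2) are made-up examples, not from the thread): if a = cb for some scalar c, then 1·a − c·b = 0 is a nontrivial linear relation, which is exactly linear dependence.

```python
# Hypothetical example: a is a scalar multiple of b, so {a, b} is dependent.
b = (1, 2)
c = 2
a = tuple(c * x for x in b)  # a = (2, 4) = 2*b

# The nontrivial relation 1*a - c*b = 0 witnesses the linear dependence.
relation = tuple(1 * x - c * y for x, y in zip(a, b))
print(relation)  # (0, 0)
```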

If you have more than two vectors, then the set can be linearly dependent even though no pair of its vectors is linearly dependent. For instance, the set of vectors {(1,0), (0,1), (1,1)} is linearly dependent in R^2, although none of them is a scalar multiple of any other.
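The dependence in that example can be exhibited explicitly (a sketch; the coefficients 1, 1, −1 are one choice among infinitely many): 1·(1,0) + 1·(0,1) − 1·(1,1) = (0,0) is a nontrivial combination giving the zero vector.

```python
# The three vectors from the example above, in R^2.
v1, v2, v3 = (1, 0), (0, 1), (1, 1)

# A nontrivial linear combination equal to the zero vector:
# 1*v1 + 1*v2 - 1*v3 = (0, 0), so the set is linearly dependent,
# even though no vector is a scalar multiple of another.
combo = tuple(1 * a + 1 * b - 1 * c for a, b, c in zip(v1, v2, v3))
print(combo)  # (0, 0)
```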

3. Originally Posted by Tinyboss
It's best not to think "parallel", since all vectors are considered to have their base at the origin. If you have a set of two nonzero vectors {a,b}, then the following are equivalent:

1. a and b are linearly dependent,
2. a=cb for some scalar c.

If you have more than two vectors, then the set can be linearly dependent even though no pair of its vectors is linearly dependent. For instance, the set of vectors {(1,0), (0,1), (1,1)} is linearly dependent in R^2, although none of them is a scalar multiple of any other.
One of the vectors, (1,1), can be written as a linear combination of (1,0) and (0,1), so isn't that vector kind of redundant?

4. Originally Posted by acevipa
One of the vectors, (1,1), can be written as a linear combination of (1,0) and (0,1), so isn't that vector kind of redundant?
The point was to show that linear dependence can occur without any two vectors being "parallel".