# Thread: Basis and Dependent Vector - Linear Algebra

1.

Suppose that $B$ is a basis for the vector space $V$ and $\vec{v}$ is not an element of $B$.
If we make a new collection $C$ by adding $\vec{v}$ to $B$, that is
$C = B \cup \{\vec{v}\}$,
show that $C$ must be a dependent set of vectors.

2. Originally Posted by flaming
> Suppose that $B$ is a basis for the vector space $V$ and $\vec{v}$ is not an element of $B$.
> If we make a new collection $C$ by adding $\vec{v}$ to $B$, that is
> $C = B \cup \{\vec{v}\}$,
> show that $C$ must be a dependent set of vectors.
Because $\vec{v} \in V$ and $B$ spans $V$, we can express $\vec{v}$ as a linear combination of the vectors in $B$.
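Spelling that one-liner out (writing $B = \{\vec{b}_1, \dots, \vec{b}_n\}$, a finite basis assumed here just for concreteness; the same argument works with any finite subset of an infinite basis):

```latex
% Since B spans V and \vec{v} \in V, there exist scalars c_1, \dots, c_n with
\vec{v} = c_1 \vec{b}_1 + c_2 \vec{b}_2 + \cdots + c_n \vec{b}_n .
% Subtracting \vec{v} from both sides gives
\vec{0} = -\vec{v} + c_1 \vec{b}_1 + c_2 \vec{b}_2 + \cdots + c_n \vec{b}_n ,
% a linear combination of vectors of C = B \cup \{\vec{v}\} in which the
% coefficient of \vec{v} is -1 \neq 0, so C is linearly dependent.
```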

3. Just to reword TPH's idea...

So a basis $B$ of a vector space $V$ is a linearly independent subset that can represent any vector in $V$ as a linear combination of vectors from $B$. For a set to be linearly independent, no linear combination of vectors in that set can produce $\vec{0}$ (other than the trivial case where all the coefficients are 0).

Thus, since $C$ contains $\vec{v}$, which is not in $B$ but is in $V$, we can represent $\vec{v}$ as a linear combination of the basis vectors, say $\vec{v} = c_1\vec{b}_1 + \cdots + c_n\vec{b}_n$. Subtracting gives $\vec{v} - c_1\vec{b}_1 - \cdots - c_n\vec{b}_n = \vec{0}$, a linear combination of vectors in $C$ whose coefficients are not all zero (the coefficient of $\vec{v}$ is 1), so $C$ is dependent.
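The same fact can be checked numerically. Below is a minimal sketch in $\mathbb{R}^3$ (the specific basis and $\vec{v}$ are my own illustrative choices, not from the thread): appending any fourth vector to a basis of a 3-dimensional space forces the rank of the resulting column matrix below the number of columns, which is exactly linear dependence.

```python
import numpy as np

# A basis B for R^3: the columns of the identity matrix.
B = np.eye(3)

# Any v in R^3 not already in B, e.g. v = 2*b1 - b2 + 5*b3.
v = np.array([2.0, -1.0, 5.0])

# C = B u {v}: four column vectors in a 3-dimensional space.
C = np.column_stack([B, v])

# rank(C) is at most dim(V) = 3, but C has 4 columns,
# so the columns of C must be linearly dependent.
print(np.linalg.matrix_rank(C), C.shape[1])  # prints: 3 4

# Recover the explicit dependence: solve B x = v, so that
# v - (x1*b1 + x2*b2 + x3*b3) = 0 is a nontrivial relation.
x = np.linalg.solve(B, v)
print(np.allclose(v - B @ x, 0))  # prints: True
```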

How badly did I mess up, TPH?