- Sep 23rd 2008, 08:38 AM, flaming: Basis and Dependent Vector - Linear Algebra
Suppose that B is a basis for the vector space V and ~v is not an element of B.

If we make a new collection C by adding ~v to B, that is

C = B U {~v};

show that C must be a dependent set of vectors.
- Sep 23rd 2008, 10:04 AM, ThePerfectHacker
- Sep 23rd 2008, 10:53 AM, Jameson
Just to reword TPH's idea...

So a basis B of the vector space V is a subset of V whose linear combinations represent every vector in V. For a set to be linearly independent, no linear combination of its vectors can equal 0 except the trivial one in which every coefficient is 0.

Thus, since C contains ~v, which is in V but not in B, we can write ~v as a linear combination of the basis vectors. Subtracting that combination from ~v gives 0 while the coefficient on ~v is 1, so the relation is non-trivial and C is dependent.
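That step can be written out as a short derivation (a sketch, writing b_1, ..., b_n for the finitely many basis vectors that appear in the combination):

```latex
% B spans V and \vec v \in V, so for some scalars c_1, \dots, c_n:
\vec v = c_1 b_1 + c_2 b_2 + \cdots + c_n b_n .
% Moving everything to one side gives
1 \cdot \vec v - c_1 b_1 - c_2 b_2 - \cdots - c_n b_n = \vec 0 .
% The coefficient of \vec v is 1 \neq 0, so this dependence relation is
% non-trivial, and hence C = B \cup \{\vec v\} is linearly dependent.
```

Note that this works even for an infinite basis, since any single linear combination uses only finitely many basis vectors.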

How badly did I mess it up, TPH?
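As a concrete numeric illustration (my own example, not from the thread): take V = R^2 with the standard basis B, and a vector v in V but not in B. Stacking the vectors of C as columns, the matrix has more columns than its rank, which is exactly linear dependence.

```python
import numpy as np

# Assumed example: B = standard basis of R^2, v = (1, 2) is in V but not in B.
B = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
v = np.array([1.0, 2.0])

# Stack the vectors of C = B U {v} as the columns of a matrix.
C = np.column_stack(B + [v])

# Three vectors in a 2-dimensional space: the rank is at most 2, so some
# non-trivial combination of the columns is zero, i.e. C is dependent.
rank = np.linalg.matrix_rank(C)
print(rank < C.shape[1])  # True: C is linearly dependent
```

The same check works for any basis of R^n: appending any extra column to an n x n invertible matrix leaves the rank at n, which is less than the n + 1 columns.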