Let's say $v_1, \dots, v_n$ are linearly independent vectors in a vector space $V$.

Suppose $w$ is a vector in $V$ and is not a linear combination of $v_1, \dots, v_n$.

Show that $v_1, \dots, v_n, w$ are linearly independent.

I have some idea, but I can't think of a rigorous proof. Any ideas?

- Oct 4th 2009, 09:49 AM, owq
Proof: Addition of a vector makes the set remain linearly independent
- Oct 4th 2009, 10:03 AM, aman_cc

Hint: consider an arbitrary linear combination $c_1 v_1 + \dots + c_n v_n + c w = 0$ and show that $c$ must be $0$; the rest then follows from the independence of $v_1, \dots, v_n$.
- Oct 4th 2009, 10:31 AM, owq
Thanks.

Can we say that since $w$ does not equal any linear combination $a_1 v_1 + \dots + a_n v_n$, the expression $c_1 v_1 + \dots + c_n v_n + c w$ will never be equal to zero unless $c_1 = \dots = c_n = c = 0$?

Does this work: since $w$ and $\operatorname{span}(v_1, \dots, v_n)$ are linearly independent, and $v_1, \dots, v_n$ are linearly independent, thus $v_1, \dots, v_n, w$ are linearly independent?

- Oct 4th 2009, 10:53 AM, aman_cc
I am also new to the subject. Though what you say is correct in essence, a more 'rigorous' argument would be:

Suppose $c_1 v_1 + \dots + c_n v_n + c w = 0$. If $c$ is not equal to $0$, it has an inverse $c^{-1}$. If we multiply the entire equation

$c_1 v_1 + \dots + c_n v_n + c w = 0$

by $c^{-1}$ we get

$w = -c^{-1}(c_1 v_1 + \dots + c_n v_n),$

which is not possible, since $w$ is not a linear combination of $v_1, \dots, v_n$. Hence $c = 0$. But then $c_1 v_1 + \dots + c_n v_n = 0$, and the linear independence of $v_1, \dots, v_n$ forces $c_1 = \dots = c_n = 0$.

(Any comments from other senior members here will be helpful.)

Quote:

Does this work: since $w$ and $\operatorname{span}(v_1, \dots, v_n)$ are linearly independent, and $v_1, \dots, v_n$ are linearly independent, thus $v_1, \dots, v_n, w$ are linearly independent?

- Oct 5th 2009, 12:53 AM, owq
If $c$ is not equal to $0$, it has an inverse $c^{-1}$.

If we multiply the entire equation $c_1 v_1 + \dots + c_n v_n + c w = 0$ by $c^{-1}$ we get $w = -c^{-1}(c_1 v_1 + \dots + c_n v_n)$,

which is not possible.

Hence $c = 0$.

I see. Is there any other way to solve this?

- Oct 5th 2009, 01:10 AM, aman_cc
Not that I know of. I feel this one is pretty fine, as we have relied only on the axioms of a field and a vector space. I guess any other approach would use a similar argument (about the existence of the inverse, as I think that is the key axiom used here). Thanks
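As a side note, the statement can also be sanity-checked numerically. Here is a small sketch using NumPy with a made-up example in $\mathbb{R}^3$ (the vectors `v1`, `v2`, `w` are illustrative, not from the thread): if $w$ lies outside $\operatorname{span}(v_1, \dots, v_n)$, appending it as an extra column raises the rank of the matrix by one, which is exactly linear independence of the enlarged set.

```python
import numpy as np

# Illustrative example: v1, v2 are linearly independent and span the
# xy-plane; w = (0, 0, 1) is not a linear combination of v1 and v2.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0])
w = np.array([0.0, 0.0, 1.0])

# v1, v2 independent: the matrix with them as columns has rank 2.
A = np.column_stack([v1, v2])
assert np.linalg.matrix_rank(A) == 2

# Appending w, which is outside span(v1, v2), keeps independence:
# the rank rises to 3, so the only solution of
# c1*v1 + c2*v2 + c*w = 0 is c1 = c2 = c = 0.
B = np.column_stack([v1, v2, w])
print(np.linalg.matrix_rank(B))  # 3
```

This is only a finite-precision check for a particular example, of course; the algebraic argument above is what proves the general case.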