Let's say $v_1, \dots, v_k$ are linearly independent vectors in a vector space $V$.
Suppose $w$ is a vector in $V$ and is not a linear combination of $v_1, \dots, v_k$.
Show that $v_1, \dots, v_k, w$ are linearly independent.
I have some idea, but I can't think of a rigorous proof. Any ideas?
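(Just to see the claim in action: here is a small sanity check on a concrete example in $\mathbb{R}^3$. The function name `is_independent` and the Gaussian-elimination approach are my own choices, not anything from the problem; it assumes rational coordinates so the arithmetic is exact.)

```python
from fractions import Fraction

def is_independent(vectors):
    """Return True iff the given vectors (rows, rational coordinates)
    are linearly independent, by computing the rank via Gaussian
    elimination over the rationals."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    m, n = len(rows), len(rows[0])
    rank = 0
    for col in range(n):
        # find a pivot in this column at or below row `rank`
        piv = next((i for i in range(rank, m) if rows[i][col] != 0), None)
        if piv is None:
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        # eliminate this column from all other rows
        for i in range(m):
            if i != rank and rows[i][col] != 0:
                f = rows[i][col] / rows[rank][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[rank])]
        rank += 1
        if rank == m:
            break
    return rank == m  # independent iff rank equals the number of vectors

# v1, v2 are independent; w = (1,1,1) is not in their span,
# and indeed {v1, v2, w} comes out independent.
v1, v2, w = [1, 0, 0], [0, 1, 0], [1, 1, 1]
print(is_independent([v1, v2]))        # True
print(is_independent([v1, v2, w]))     # True
# a vector that IS in the span breaks independence:
print(is_independent([v1, v2, [2, 3, 0]]))  # False
```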
I am also new to the subject. What you say is correct in essence, but a more rigorous argument would be the following. Suppose
$$c_1 v_1 + \cdots + c_k v_k + c\,w = 0.$$
If $c$ is not equal to $0$, it has an inverse $c^{-1}$. If we multiply the entire equation by $c^{-1}$ and rearrange, we get
$$w = -c^{-1}c_1 v_1 - \cdots - c^{-1}c_k v_k,$$
which writes $w$ as a linear combination of $v_1, \dots, v_k$. This is not possible by hypothesis, hence $c$ must equal $0$. The equation then reduces to $c_1 v_1 + \cdots + c_k v_k = 0$, and the independence of $v_1, \dots, v_k$ forces $c_1 = \cdots = c_k = 0$.
(Any comments from other senior members here will be helpful.)
My question would be: how do you define (and then prove) independence of a vector $w$ with respect to a set $\operatorname{span}(v_1, \dots, v_k)$? Does this work: since $w$ and $\operatorname{span}(v_1, \dots, v_k)$ are linearly independent, and $v_1, \dots, v_k$ are linearly independent, thus $v_1, \dots, v_k, w$ are linearly independent?
Not that I know of. I feel this proof is fine as it stands, since it relies only on the axioms of a field and of a vector space. I would guess any other approach has to use a similar argument about the existence of an inverse, as I think that is the key axiom here. Thanks.
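For what it's worth, the hypothesis is usually phrased without defining any new notion of "independence between a vector and a set": one simply writes $w \notin \operatorname{span}(v_1, \dots, v_k)$, which unpacks to

$$w \neq a_1 v_1 + \cdots + a_k v_k \quad \text{for all scalars } a_1, \dots, a_k,$$

and this is exactly the "$w$ is not a linear combination of $v_1, \dots, v_k$" hypothesis the problem states, so the argument above already covers it.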