I hope this is the right subforum: the question stems from this other thread:
Induction of linear independent vectors
Use induction over $n$ to prove that for any field elements $a_1, \dots, a_n$ and any vector $v$: $(a_1 + \cdots + a_n)v = a_1 v + \cdots + a_n v$.
Ok, so there's nothing special about the formula.
ii) but then, for the inductive step, $(a_1 + \cdots + a_n + a_{n+1})v = \,?$
I mean... what am I showing here, exactly, at step $n+1$?
Does this work?
I'm a bit confused about the inductive step in this kind of problem. It seems kind of obvious and I guess it's really just an example of the distributive law?
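For what it's worth, here is how I'd write out the inductive step, assuming the identity in question is $(a_1 + \cdots + a_n)v = a_1 v + \cdots + a_n v$ (my guess at the stripped formula):

```latex
\begin{align*}
(a_1 + \cdots + a_n + a_{n+1})\,v
  &= \bigl((a_1 + \cdots + a_n) + a_{n+1}\bigr)\,v \\
  &= (a_1 + \cdots + a_n)\,v + a_{n+1}\,v
     && \text{(two-term distributive axiom)} \\
  &= a_1 v + \cdots + a_n v + a_{n+1} v
     && \text{(induction hypothesis)}
\end{align*}
```

So yes: the whole content of the step is one application of the two-term distributive law, with the induction hypothesis handling the first $n$ terms.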
Hmmm... thanks Abstractionist. That's straightforward enough:
Let $V$ be a vector space over a field $K$, and let $v_1, \dots, v_n$ be linearly independent vectors in $V$. Prove by induction that the vectors $w_j = v_1 + v_2 + \cdots + v_j$, $j = 1, \dots, n$, are linearly independent.
So, there is a set $W = \{w_1, \dots, w_n\}$ with $w_j = v_1 + \cdots + v_j$.
The induction hypothesis is that $w_1, \dots, w_k$ are linearly independent,
but there are only $n$ vectors in $W$?
$\{v_1, \dots, v_n\}$ is the set of linearly independent vectors,
and $w_1 = v_1$ is trivially linearly independent, as the only (nonzero) vector in $\{w_1\}$.
So, the question is how to show that the vectors in $W$ are linearly independent. It's easy to see, because the coefficient matrix formed by the $n$ linear equations is lower triangular with $1$s on the diagonal, which row-reduces to the identity $I_n$. The span is going to have dimension $n$ and therefore $W$ is linearly independent? But anyway, this has to be shown using induction?
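As a quick numeric sanity check of that matrix argument (a sketch only: I'm assuming $w_j = v_1 + \cdots + v_j$ and taking the $v_i$ to be the standard basis of $\mathbb{R}^n$, whereas the problem is over an arbitrary field):

```python
import numpy as np

n = 5
# Take v_1, ..., v_n as the standard basis of R^n (illustration only);
# then w_j = v_1 + ... + v_j has coordinates (1, ..., 1, 0, ..., 0).
V = np.eye(n)
W = np.array([V[:j + 1].sum(axis=0) for j in range(n)])

# W (rows = the w_j) is lower triangular with 1s on the diagonal,
# so det(W) = 1 != 0 and the w_j are linearly independent.
print(np.linalg.matrix_rank(W))  # full rank: n
print(round(np.linalg.det(W)))   # 1
```

The rank computation confirms the row-reduction claim, but of course it is not a proof, and it says nothing about the inductive structure the problem asks for.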
And $W$ is linearly independent iff $c_1 = \cdots = c_n = 0$ is the only solution to the combination $c_1 w_1 + \cdots + c_n w_n = 0$.
So, a nontrivial combination of the vectors in $\{v_1, \dots, v_n\}$ is never $0$, by definition of their being linearly independent; therefore the only combination of the vectors in $W$ that gives zero is when $c_1 = \cdots = c_n = 0$.
But I need help with the induction.
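In case it helps, here is a sketch of the inductive step I think the problem wants, again assuming $w_j = v_1 + \cdots + v_j$:

```latex
% Suppose w_1, ..., w_k are linearly independent (induction hypothesis),
% and suppose  c_1 w_1 + ... + c_{k+1} w_{k+1} = 0.
% Expanding each w_j in terms of the v_i and collecting coefficients:
\begin{align*}
0 \;=\; \sum_{j=1}^{k+1} c_j w_j
  \;=\; \sum_{j=1}^{k+1} c_j \sum_{i=1}^{j} v_i
  \;=\; \sum_{i=1}^{k+1} \Bigl(\sum_{j=i}^{k+1} c_j\Bigr) v_i .
\end{align*}
% Since v_1, ..., v_{k+1} are linearly independent, every inner sum
% vanishes. The i = k+1 term gives c_{k+1} = 0 (v_{k+1} occurs only
% in w_{k+1}), and the relation collapses to
% c_1 w_1 + ... + c_k w_k = 0, so the induction hypothesis gives
% c_1 = ... = c_k = 0.
```

The key observation is exactly the triangular shape you noticed: $v_{k+1}$ appears in $w_{k+1}$ alone, so its coefficient isolates $c_{k+1}$.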