You need to explain why we know that $k \neq 0$.
Q: Provided that $\{v_1, \dots, v_n\}$ is a linearly independent set of vectors and that the set $\{v_1, \dots, v_n, v\}$ is linearly dependent, prove that $v$ is a linear combination of the $v_i$'s.
A: Since $\{v_1, \dots, v_n\}$ is linearly independent, the vector equation
$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0$$
has only the trivial solution, $c_1 = c_2 = \cdots = c_n = 0$. On the other hand, the set $\{v_1, \dots, v_n, v\}$ is linearly dependent; thus, there exists a vector in $\{v_1, \dots, v_n, v\}$ that can be written as a linear combination of the others. Since $\{v_1, \dots, v_n\}$ is known to be linearly independent, we can narrow our search to just one vector, the vector $v$. So, we have a new vector equation,
$$v = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n,$$
i.e., $v$ can be written as a linear combination of the $v_i$'s.
I dunno, I feel like a chunk is missing. Is that a sufficient proof?
Well, $k \neq 0$, because the set $\{v_1, \dots, v_n, v\}$ is linearly dependent and, as I tried explaining, every vector in the set is linearly independent except for the vector $v$. This implies that, when we set our new collection equal to the zero vector as a linear combination, $k$ cannot equal zero, because if $k = 0$ we would have only the trivial solution and in turn $\{v_1, \dots, v_n, v\}$ would be linearly independent. So, the set being dependent forces $k \neq 0$.
Is that the right train of thought?
I think things look good; you just gotta be more careful with how you say things and set up the proof. You are given that $\{v_1, \dots, v_n\}$ is L.I. and that when you add that vector $v$ to it, it makes it L.D.
So you say: since $\{v_1, \dots, v_n, v\}$ is L.D., there exists a nontrivial linear combination such that
$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n + kv = 0.$$
Suppose for a moment that $k = 0$. Then at least one of those $c_i \neq 0$. But then you have just found a nontrivial linear combination of $v_1, \dots, v_n$ which sums to $0$, in particular $c_1 v_1 + \cdots + c_n v_n = 0$, which contradicts the linear independence of $\{v_1, \dots, v_n\}$. Thus, you know $k \neq 0$ and you can proceed using your argument.
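To spell out the final step that argument sets up (just the algebra, nothing new): starting from the nontrivial dependence relation and using $k \neq 0$,

```latex
% From the nontrivial dependence relation, with k \neq 0:
\[
c_1 v_1 + c_2 v_2 + \cdots + c_n v_n + k v = 0
\;\Longrightarrow\;
v = -\frac{c_1}{k}\, v_1 - \frac{c_2}{k}\, v_2 - \cdots - \frac{c_n}{k}\, v_n,
\]
% which exhibits v explicitly as a linear combination of the v_i's.
```

which is exactly the conclusion the problem asks for.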
Might be worth noting why $k^{-1}$ exists. I don't know whether you are in abstract algebra or linear algebra, but you can have modules, which are basically vector spaces where the coefficients come from a ring in which multiplicative inverses need not exist. You are in a vector space, which is over a field, so it makes sense to divide by $k$ since you have shown it to be a nonzero element of the field, which means it is a unit (is invertible).
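Not a proof, but a quick numerical sanity check of the statement may help build intuition. The specific vectors below are made up for illustration: two independent vectors in $\mathbb{R}^3$, plus a third built as a combination of them, checked via matrix rank and a least-squares solve.

```python
import numpy as np

# {v1, v2} is linearly independent; v = 2*v1 + 3*v2 makes the
# enlarged set linearly dependent. (Example vectors are arbitrary.)
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v = 2 * v1 + 3 * v2

A = np.column_stack([v1, v2])

# rank 2 with two columns -> {v1, v2} is L.I.
print(np.linalg.matrix_rank(A))  # → 2
# rank still 2 with three columns -> {v1, v2, v} is L.D.
print(np.linalg.matrix_rank(np.column_stack([v1, v2, v])))  # → 2

# Solve A @ c = v for the coefficients; since v lies in the span,
# least squares recovers them exactly (c should be close to [2, 3]).
c, *_ = np.linalg.lstsq(A, v, rcond=None)
print(c)
```

The rank not increasing when $v$ is appended is the concrete face of "adding $v$ makes the set dependent," and the solve exhibits $v$ as a combination of $v_1, v_2$.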