1. ## Linear combination proof

Q: provided that $\displaystyle S_{0}=\{\vec{u_{1}},...,\vec{u_{n}}\}$ is a linearly independent set of vectors and that the set $\displaystyle S_{1}=\{\vec{u_{1}},...,\vec{u_{n}},\vec{v}\}$ is linearly dependent, prove that $\displaystyle \vec{v}$ is a linear combination of the $\displaystyle \vec{u_{i}}$'s.

A: Since $\displaystyle S_{0}$ is linearly independent, the vector equation

$\displaystyle c_{1}\vec{u_1}+c_{2}\vec{u_2}+\cdots+c_{n}\vec{u_n}=\vec{0}$

has only the trivial solution, $\displaystyle c_{1}=0, c_{2}=0,\dots,c_{n}=0$. On the other hand, the set $\displaystyle S_{1}$ is linearly dependent; thus, some vector in $\displaystyle S_{1}$ can be written as a linear combination of the others. Since $\displaystyle S_{1}=S_{0}\cup\{\vec{v}\}$ and $\displaystyle S_{0}$ is known to be linearly independent, we can narrow our search to just one vector, the vector $\displaystyle \vec{v}$. So, we have a new vector equation,

$\displaystyle c_{1}\vec{u_1}+c_{2}\vec{u_2}+\cdots+c_{n}\vec{u_n}+k\vec{v}=\vec{0}$

where $\displaystyle k\in{\mathbb{R}}$ and the coefficients are not all zero.

Solving for $\displaystyle \vec{v}$

$\displaystyle \vec{v}=-\frac{c_{1}}{k}\vec{u_1}-\cdots-\frac{c_{n}}{k}\vec{u_n}$

$\displaystyle \therefore$ $\displaystyle \vec{v}$ can be written as a linear combination of the $\displaystyle \vec{u_{i}}$'s.
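A quick numeric sanity check of the argument above (the specific vectors and coefficients are my own illustration, not part of the problem):

```python
# Take the independent pair u1=(1,0), u2=(0,1) in R^2 and let v=(2,3).
# The relation c1*u1 + c2*u2 + k*v = 0 holds with c1=2, c2=3, k=-1,
# so solving for v gives v = -(c1/k)*u1 - (c2/k)*u2 = 2*u1 + 3*u2.
u1, u2, v = (1.0, 0.0), (0.0, 1.0), (2.0, 3.0)
c1, c2, k = 2.0, 3.0, -1.0

# The dependence relation should produce the zero vector.
relation = tuple(c1*a + c2*b + k*c for a, b, c in zip(u1, u2, v))
print(relation)  # (0.0, 0.0)

# Solving for v, valid because k != 0.
v_rebuilt = tuple(-(c1/k)*a - (c2/k)*b for a, b in zip(u1, u2))
print(v_rebuilt)  # (2.0, 3.0)
```

The division by $k$ in the last step is exactly where $k\neq 0$ gets used, which is the point raised in the replies below.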

I dunno, I feel like a chunk is missing. Is that a sufficient proof?

Thanks

2. You need to explain why we know that $\displaystyle t\not=0$.

3. Originally Posted by Plato
You need to explain why we know that $\displaystyle t\not=0$.
are you talking about the constant $\displaystyle k$?

4. Well, $\displaystyle k\neq{0}$, because the set $\displaystyle S_{1}$ is linearly dependent and, as I tried explaining, every vector in the set is linearly independent except for the vector $\displaystyle \vec{v}$. This implies that, when we write our new set as a linear combination equal to the zero vector, $\displaystyle k$ cannot equal zero, because if $\displaystyle k=0$ we would have only the trivial solution and in turn $\displaystyle S_{1}$ would be linearly independent. So, $\displaystyle S_{1}$ being dependent forces $\displaystyle k\neq{0}$.

Is that the right train of thought?

5. ## right track

I think things look good; you just gotta be more careful with how you say things and set up the proof. You are given that $\displaystyle \{u_1,...,u_n\}$ is L.I. and that adding the vector $v$ to it, giving $\displaystyle \{u_1,...,u_n, v\}$, makes it L.D.

So you say: since $\displaystyle \{u_1,...,u_n, v\}$ is L.D., there exists a nontrivial linear combination such that

$\displaystyle c_1u_1+...+c_nu_n + kv=0$

Suppose for a moment that $k=0$. Then, since the combination is nontrivial, at least one of the $\displaystyle c_i\not = 0$. But then you have just found a nontrivial linear combination of $\displaystyle \{u_1,...,u_n\}$ which sums to 0, namely $\displaystyle c_1u_1+...+c_nu_n=0$, which contradicts the linear independence of $\displaystyle \{u_1,...,u_n\}$. Thus, you know $\displaystyle k \not = 0$ and you can proceed with your argument.
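The contradiction step can be checked on a small concrete instance (the vectors and the coefficient range are my own illustration): for an independent pair, no nontrivial choice of coefficients sums to the zero vector, so the dependence relation for $S_1$ cannot have $k=0$.

```python
# With the independent pair u1=(1,0), u2=(0,1), search a range of integer
# coefficients (c1, c2) != (0, 0) for a combination summing to zero.
# Linear independence says the search must come up empty.
u1, u2 = (1, 0), (0, 1)
nontrivial_zero_sums = [
    (c1, c2)
    for c1 in range(-3, 4)
    for c2 in range(-3, 4)
    if (c1, c2) != (0, 0)
    and (c1*u1[0] + c2*u2[0], c1*u1[1] + c2*u2[1]) == (0, 0)
]
print(nontrivial_zero_sums)  # [] -> no nontrivial relation, so k = 0 is impossible
```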

It might be worth noting why $\displaystyle \frac{1}{k}$ exists. I don't know whether you are in abstract algebra or linear algebra, but there are modules, which are basically vector spaces whose coefficients come from a ring, where multiplicative inverses need not exist. You are in a vector space over a field, so it makes sense to divide by $k$: you have shown it to be a nonzero element of the field, which means it is a unit (invertible).
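To see why invertibility matters, here is a check of a standard counterexample over the ring $\mathbb{Z}$ (my own illustration, not from the thread): in the $\mathbb{Z}$-module $\mathbb{Z}$, the set $\{2, 3\}$ is linearly dependent, yet $3$ is not a $\mathbb{Z}$-linear combination of $2$, because the nonzero coefficient on $3$ has no inverse in $\mathbb{Z}$.

```python
# {2} is independent over Z, and adding 3 gives a dependent set:
# 3*2 + (-2)*3 = 0 is a nontrivial dependence relation.
dependence = 3 * 2 + (-2) * 3
print(dependence)  # 0

# Yet no integer c satisfies c*2 == 3, so 3 is not a Z-linear
# combination of 2: the division-by-k step fails outside a field.
solutions = [c for c in range(-10, 11) if c * 2 == 3]
print(solutions)  # []
```

So the proposition in the original question genuinely depends on the scalars forming a field.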