How do I prove that a set of vectors is independent?

• Apr 4th 2010, 06:58 AM
jayshizwiz
How do I prove that a set of vectors is independent?
I've tried many things but have been unable to answer this question:

$v$ and $u_1,...,u_k$ are vectors in $R^n$. Let $v$ be a linear combination of $u_1,...,u_k$ and have a single solution.

Prove $u_1,...,u_k$ are independent.

Hint: let $u_1,...,u_k$ be dependent vectors.

Thanks!
• Apr 4th 2010, 09:11 AM
nimon
Hi jayshizwiz,

I think you haven't had a reply until now because parts of your question are hard to follow. Try to state your question carefully enough that the reader doesn't have to guess what you mean: what do you mean by "a single solution"? Also, the hint should surely read "Suppose that the above holds, but that $u_{1}, \ldots, u_{k}$ are linearly dependent", not merely "Let $u_{1}, \ldots, u_{k}$ be dependent vectors", because a definition is not something we can later contradict, whereas a supposition made for the sake of argument is exactly what we refute. So if we suppose that they are dependent, we can contradict the initial hypothesis.

In this case, the proper statement of the problem should be: suppose that each $v\in\mathbb{R}^{n}$ has a unique representation as a linear combination of the vectors $u_{1},\ldots,u_{k}.$ Then prove that $u_{1},\ldots,u_{k}$ are linearly independent.

Proof (I will get you started)

Suppose that the statement holds and that $u_{1}, \ldots, u_{k}$ are dependent, and let $v$ be written uniquely as

$v = \lambda_{1}u_{1}+\ldots+\lambda_{k}u_{k}$ (1)

By linear dependence, there exist scalars $\mu_{1},\ldots,\mu_{k}$ such that

$\mu_{1}u_{1}+\ldots+\mu_{k}u_{k} = 0$ (2)

with some $\mu_{j}\neq 0,\quad 1\leq j \leq k.$ Then rearranging (2) gives

$u_{j}=-\frac{1}{\mu_{j}}\sum_{i\neq j}{\mu_{i}u_{i}}$

Now try to substitute this representation of $u_{j}$ into the original representation of $v$ given in (1). Is this a different expression for $v$ as a linear combination of $u_{1},\ldots,u_{k}$?
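To see concretely why dependence destroys uniqueness, here is a quick numerical sketch (my own toy example with $k=2$ in $\mathbb{R}^{2}$, not part of the exercise):

```python
import numpy as np

# Two linearly dependent vectors in R^2: u2 = 2*u1.
u1 = np.array([1.0, 0.0])
u2 = np.array([2.0, 0.0])

# The same v admits (at least) two different representations.
v_a = 3.0 * u1 + 0.0 * u2   # coefficients (3, 0)
v_b = 1.0 * u1 + 1.0 * u2   # coefficients (1, 1)

print(np.allclose(v_a, v_b))  # both give the vector (3, 0)
```

So with dependent vectors, a representation of $v$ cannot be unique; that is exactly the contradiction the proof is after.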
• Apr 5th 2010, 10:00 AM
jayshizwiz
Thanks nimon,

I don't study in English, so I'm translating as best I can.

I still don't know where to continue with this:

$v = \lambda_{1}u_{1}+\ldots+\lambda_{k}u_{k}$

and

$u_{j}=-\frac{1}{\mu_{j}}\sum_{i\neq j}{\mu_{i}u_{i}}$

I don't know which of the vectors $u_{j}$ refers to. How do I know whether to replace $u_1$ or $u_2$?
• Apr 7th 2010, 01:28 AM
nimon
The vector $u_{j}$ we picked was any vector whose coefficient in (2) is non-zero, and we know that such a $u_{j}$ exists by linear dependence. This $j$ could be any number between $1$ and $k$; we don't want to assume that it is $1$ or $2$. We just know that one of the vectors has a non-zero coefficient, and we call that vector $u_{j}$.

Given that $v=\lambda_{1}u_{1}+\ldots+\lambda_{j}u_{j}+\ldots+ \lambda_{k}u_{k}$, we can now replace $u_{j}$ in this expression with $u_{j}=-\frac{1}{\mu_{j}}\sum_{i\neq j}{\mu_{i}u_{i}}$ to get:

$v=\lambda_{1}u_{1}+\ldots-\lambda_{j}\frac{1}{\mu_{j}} \sum_{i\neq j}{\mu_{i}u_{i}}+\ldots+\lambda_{k}u_{k}$ (3)

The notation $\sum\limits_{i\neq j}$ means to sum over all $i=1,\ldots,k$ but not $j$.

Now just try to collect the coefficients in (3) so as to express $v$ as a linear combination of $\{u_{1},\ldots,u_{k}\}\backslash \{u_{j}\}$.
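As a sanity check of that substitution step, here is a small numerical sketch (my own hypothetical example in $\mathbb{R}^{3}$ with $u_{3}=u_{1}+u_{2}$, so $\mu=(1,1,-1)$; indices are 0-based in the code):

```python
import numpy as np

# Three dependent vectors in R^3: u[2] = u[0] + u[1],
# i.e. 1*u[0] + 1*u[1] - 1*u[2] = 0.
u = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 0.0])]
lam = [2.0, 3.0, 1.0]        # original coefficients of v, as in (1)
mu = [1.0, 1.0, -1.0]        # dependence relation, as in (2)
j = 2                        # an index with mu[j] != 0

v = sum(l * ui for l, ui in zip(lam, u))

# Substituting u_j = -(1/mu_j) * sum_{i != j} mu_i u_i into (1) and
# collecting: the coefficient of u_i becomes lam_i - lam_j * mu_i / mu_j,
# and u_j drops out entirely.
new_lam = [lam[i] - lam[j] * mu[i] / mu[j] if i != j else 0.0
           for i in range(3)]
v2 = sum(l * ui for l, ui in zip(new_lam, u))

print(np.allclose(v, v2), new_lam != lam)  # same v, different coefficients
```

The two coefficient lists differ but produce the same $v$, which is the contradiction with uniqueness that the hint is pointing you towards.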

I hope this is helpful, and sorry for the lateness of my reply. Your English seems very good for someone who doesn't study it!