# Thread: How do I prove that a set of vectors is independent?

1. ## How do I prove that a set of vectors is independent?

I've tried many things but have been unable to answer this question:

$\displaystyle v$ and $\displaystyle u_1,...,u_k$ are vectors in $\displaystyle R^n$. Let $\displaystyle v$ be a linear combination of $\displaystyle u_1,...,u_k$ and have a single solution.

Prove $\displaystyle u_1,...,u_k$ are independent.

Hint: let $\displaystyle u_1,...,u_k$ be dependent vectors.

Thanks!

2. Hi jayshizwiz,

I think you haven't received a reply until now because some parts of your question are hard to follow. You need to state the question carefully enough that the reader doesn't have to guess what you mean: for instance, what do you mean by "a single solution"? Also, the hint should surely read "Suppose that the above holds, but that $\displaystyle u_{1}, \ldots, u_{k}$ are linearly dependent", not merely "Let $\displaystyle u_{1}, \ldots, u_{k}$ be independent", because if we define these vectors to be independent, we cannot contradict ourselves! But if we suppose that they are dependent, then we can contradict the initial hypothesis.

In this case, the proper statement of the problem should be: suppose that some $\displaystyle v\in\mathbb{R}^{n}$ has a unique representation as a linear combination of the vectors $\displaystyle u_{1},\ldots,u_{k}.$ Then prove that $\displaystyle u_{1},\ldots,u_{k}$ are linearly independent.

Proof (I will get you started)

Suppose that the statement holds and that $\displaystyle u_{1}, \ldots, u_{k}$ are dependent, and let $\displaystyle v$ be written uniquely as

$\displaystyle v = \lambda_{1}u_{1}+\ldots+\lambda_{k}u_{k}$ (1)

By linear dependence, there exist scalars $\displaystyle \mu_{1},\ldots,\mu_{k}$ such that

$\displaystyle \mu_{1}u_{1}+\ldots+\mu_{k}u_{k} = 0$ (2)

with some $\displaystyle \mu_{j}\neq 0,\quad 1\leq j \leq k.$ Then rearranging (2) gives

$\displaystyle u_{j}=-\frac{1}{\mu_{j}}\sum_{i\neq j}{\mu_{i}u_{i}}$

Now try to substitute this representation of $\displaystyle u_{j}$ into the original representation of $\displaystyle v$ given in (1). Is this a different expression for $\displaystyle v$ as a linear combination of $\displaystyle u_{1},\ldots,u_{k}$?

3. Thanks nimon,

I don't study in English so I'm trying to translate as best as I can.

I still don't know where to continue with this:

$\displaystyle v = \lambda_{1}u_{1}+\ldots+\lambda_{k}u_{k}$

and

$\displaystyle u_{j}=-\frac{1}{\mu_{j}}\sum_{i\neq j}{\mu_{i}u_{i}}$

I don't know which of the vectors $\displaystyle u_{j}$ refers to. How do I know whether to replace it with $\displaystyle u_1$ or with $\displaystyle u_2$?

4. The vector $\displaystyle u_{j}$ we picked is any vector whose coefficient in the solution of (2) is non-zero, and we know that such a $\displaystyle u_{j}$ exists because of linear dependence. The index $\displaystyle j$ could be any number between $\displaystyle 1$ and $\displaystyle k$, and we don't want to assume that it is $\displaystyle 1$ or $\displaystyle 2$; we only know that one of the vectors has a non-zero coefficient, so we call that vector $\displaystyle u_{j}$.

Given that $\displaystyle v=\lambda_{1}u_{1}+\ldots+\lambda_{j}u_{j}+\ldots+ \lambda_{k}u_{k}$, we can now replace $\displaystyle u_{j}$ in this expression with $\displaystyle u_{j}=-\frac{1}{\mu_{j}}\sum_{i\neq j}{\mu_{i}u_{i}}$ to get:

$\displaystyle v=\lambda_{1}u_{1}+\ldots-\frac{\lambda_{j}}{\mu_{j}} \sum_{i\neq j}{\mu_{i}u_{i}}+\ldots+\lambda_{k}u_{k}$ (3)

The notation $\displaystyle \sum\limits_{i\neq j}$ means to sum over all $\displaystyle i=1,\ldots,k$ but not $\displaystyle j$.

Now just try to collect the coefficients in (3) to express $\displaystyle v$ as a linear combination of $\displaystyle \{u_{1},\ldots,u_{k}\}\backslash \{u_{j}\}.$
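For reference, here is one way the collection step can come out (a sketch, not worked in the thread; note the minus sign that appears when $\displaystyle u_{j}$ is moved to one side of (2) before dividing by $\displaystyle \mu_{j}$):

```latex
% Substituting the rearranged u_j into (1) and collecting coefficients:
\[
  v \;=\; \sum_{i \neq j} \lambda_i u_i \;+\; \lambda_j u_j
    \;=\; \sum_{i \neq j} \left( \lambda_i - \frac{\lambda_j \mu_i}{\mu_j} \right) u_i .
\]
% This representation assigns coefficient 0 to u_j, while (1) assigned lambda_j,
% so the two representations differ whenever lambda_j is non-zero; if lambda_j = 0,
% adding (2) to (1) term by term gives a second, different representation directly,
% since some mu_j is non-zero.
```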

I hope this is helpful, and sorry for the lateness of my reply. Your English seems very good for someone who doesn't study it!
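As a concrete illustration of why dependence destroys uniqueness, here is a small NumPy sketch (not from the thread; the vectors and coefficients below are made up for the example):

```python
import numpy as np

# Columns are u_1, u_2, u_3 in R^2; u_3 = u_1 + u_2, so the set is dependent.
U = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

v = np.array([2.0, 3.0])

# Two different coefficient vectors that both represent the same v:
coeffs_a = np.array([2.0, 3.0, 0.0])
coeffs_b = np.array([1.0, 2.0, 1.0])  # shift one copy of u_1 + u_2 onto u_3

print(np.allclose(U @ coeffs_a, v))           # True: first representation of v
print(np.allclose(U @ coeffs_b, v))           # True: a second, different one
print(np.linalg.matrix_rank(U) < U.shape[1])  # True: rank 2 < 3 => dependent
```

The difference `coeffs_a - coeffs_b` is exactly a non-trivial solution of equation (2) in the thread: a non-zero combination of the $u_i$ that sums to the zero vector.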