
Linear Algebra proof
I have shown that if a set of vectors (v1,...,vn) is linearly independent, then every vector v that is an element of <v1,...,vn> has a unique representation as a linear combination of the vectors v1,...,vn.
Now I need to show that the converse statement is true, that is, if every such vector v has a unique representation as a linear combination of the vectors v1,...,vn, then the set of vectors (v1,...,vn) is linearly independent. Does anyone have suggestions on how to show that?

Suppose we can write a vector x as a_1v_1+...+a_nv_n and as b_1v_1+...+b_nv_n. We want to show a_1=b_1,...,a_n=b_n for uniqueness. That means a_1v_1+...+a_nv_n = b_1v_1+...+b_nv_n. Subtracting, (a_1-b_1)v_1+...+(a_n-b_n)v_n = 0. Since these vectors are linearly independent, only the trivial combination gives zero. So a_1 - b_1 = 0, ..., a_n - b_n = 0, hence a_1 = b_1, ..., a_n = b_n. Q.E.D.
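This uniqueness argument can be checked numerically. A minimal sketch with NumPy (the vectors and coefficients here are arbitrary choices for illustration, not from the thread):

```python
import numpy as np

# Columns of V are three linearly independent vectors v1, v2, v3 in R^3
# (V is upper triangular with nonzero diagonal, so it is invertible).
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

a = np.array([2.0, -1.0, 3.0])   # one choice of coefficients
x = V @ a                        # x = a1*v1 + a2*v2 + a3*v3

# Because the columns are independent, solving V b = x recovers
# exactly the coefficients we started from: the representation is unique.
b = np.linalg.solve(V, x)
print(np.allclose(a, b))
```

Independence of the columns is what guarantees `solve` has exactly one solution; with dependent columns the system would have infinitely many.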

That is how I showed the first part, that there is a unique representation. I don't think that works to show the converse: that if there is a unique representation, then the set of vectors (v1,...,vn) is independent.

Correct, not every vector can be expressed as a linear combination of a given linearly independent set.

A nice google search helped me find the answer to my question:
Elements of Operator Theory (via Google Book Search)
Thanks for your help, though! :)

What is so hard? Consider $\displaystyle \{ (0,1,0),(1,0,0) \}$ this is linearly independent but $\displaystyle (0,0,1)$ cannot be obtained as a linear combination of those two elements.
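That counterexample can be verified numerically as well; a quick NumPy sketch (least squares is just one convenient way to test membership in the span):

```python
import numpy as np

# The two independent vectors from the example, as columns.
V = np.array([[0.0, 1.0],
              [1.0, 0.0],
              [0.0, 0.0]])
target = np.array([0.0, 0.0, 1.0])

# Least squares finds the best linear combination of the columns.
# A nonzero residual means (0,0,1) is NOT in the span of
# (0,1,0) and (1,0,0): no exact linear combination exists.
coeffs, residual, rank, _ = np.linalg.lstsq(V, target, rcond=None)
print(residual)
```

Here the best approximation from the span is the zero vector, so the squared residual equals the squared norm of the target.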

just try to understand the proof.. Ü
Suppose v has a unique representation as a linear combination of the vectors $\displaystyle v_1, v_2, ... ,v_n$, i.e.
$\displaystyle v = \sum_{i=1}^n a_iv_i = a_1v_1 + a_2v_2 + ... + a_nv_n$
and suppose the $\displaystyle S:= \{ v_1, v_2, ..., v_n \}$ is linearly dependent.
then, $\displaystyle \exists v_j \in S$ which is a linear combination of the other vectors in S (assuming you have proven this already): say $\displaystyle b_1v_1 + ... + b_nv_n = 0$ with $\displaystyle b_j \neq 0$, so that
$\displaystyle v_j = -(b_j^{-1}b_1v_1 + b_j^{-1}b_2v_2 + ... + b_j^{-1}b_{j-1}v_{j-1} + b_j^{-1}b_{j+1}v_{j+1} + ... + b_j^{-1}b_nv_n)$ ... (*)
from
$\displaystyle v = \sum_{i=1}^n a_iv_i = a_1v_1 + a_2v_2 + ... + a_jv_j + ... + a_nv_n$, we replace $\displaystyle v_j$ by (*), and since V is an abelian group, we can rearrange and combine like terms, so that,
$\displaystyle v = \sum_{i=1}^n a_iv_i = (a_1 - a_jb_j^{-1}b_1)v_1 + (a_2 - a_jb_j^{-1}b_2)v_2 + ... + (a_{j-1} - a_jb_j^{-1}b_{j-1})v_{j-1} +$
$\displaystyle (a_{j+1} - a_jb_j^{-1}b_{j+1})v_{j+1} + ... + (a_n - a_jb_j^{-1}b_n)v_n$
and clearly, $\displaystyle a_k - a_jb_j^{-1}b_k = a_k \Leftrightarrow a_jb_j^{-1}b_k = 0$, for $\displaystyle k \neq j$.
note that S cannot contain the zero vector, since otherwise v would already have more than one representation. now in the new expression the coefficient of $\displaystyle v_j$ is 0, while in the original it is $\displaystyle a_j$; choosing v so that $\displaystyle a_j \neq 0$ (possible because uniqueness is assumed for every vector in the span), the two expressions are different representations of v,
which contradicts the assumption that v has a unique representation as a linear combination of the vectors in S.
therefore, S must be lin. indep set. QED
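The contrapositive can also be seen concretely: with a dependent set, some vector has two genuinely different representations. A small NumPy sketch (the vectors are arbitrary choices for illustration):

```python
import numpy as np

# A linearly dependent set: v3 = v1 + v2.
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = v1 + v2

# Two distinct coefficient lists giving the same vector v:
v_first  = 1.0 * v1 + 1.0 * v2 + 0.0 * v3   # coefficients (1, 1, 0)
v_second = 0.0 * v1 + 0.0 * v2 + 1.0 * v3   # coefficients (0, 0, 1)

print(np.allclose(v_first, v_second))
```

The dependence relation $v_1 + v_2 - v_3 = 0$ is exactly what lets us trade one representation for the other, which is the substitution step (*) in the proof above.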