# Thread: Basis and Determinant Proof

## 1. Basis and Determinant Proof

Let β = {u_1, u_2, ..., u_n} be a subset of F^n containing n distinct vectors, and let B be the n×n matrix over F having u_j as column j.
Prove that β is a basis for F^n if and only if det(B) ≠ 0.

For one direction of the proof I discussed this with a peer:

Since β consists of n vectors, β is a basis if and only if these vectors are linearly independent, which is equivalent to the map L_B being one-to-one. Since the matrix B is square, this is in turn equivalent to B being invertible, hence having a nonzero determinant.

However, I do not understand the transition from the vectors being linearly independent to the map L_B being one-to-one. Why is this true? Also, how do I prove the reverse direction?

## 2. Re: Basis and Determinant Proof

> **Originally Posted by rubixcircle**
>
> Let β = {u_1, u_2, ..., u_n} be a subset of F^n containing n distinct vectors, and let B be the n×n matrix over F having u_j as column j.
> Prove that β is a basis for F^n if and only if det(B) ≠ 0.
>
> For one direction of the proof I discussed this with a peer:
>
> Since β consists of n vectors, β is a basis if and only if these vectors are linearly independent, which is equivalent to the map L_B being one-to-one. Since the matrix B is square, this is in turn equivalent to B being invertible, hence having a nonzero determinant.
>
> However, I do not understand the transition from the vectors being linearly independent to the map L_B being one-to-one. Why is this true? Also, how do I prove the reverse direction?

You are right to distinguish the two: linear independence is a property of the vectors, while one-to-one is a property of the map L_B built from them. They are connected by how matrix-vector multiplication works: for x = (x_1, ..., x_n) in F^n, we have Bx = x_1 u_1 + x_2 u_2 + ... + x_n u_n, a linear combination of the columns. So Bx = 0 has only the trivial solution x = 0 exactly when the columns are linearly independent; and a linear map L_B is one-to-one exactly when its kernel is {0}, that is, when Bx = 0 forces x = 0. These are the same condition. As for the reverse direction: every step in your peer's argument is an "if and only if", so the same chain read backwards gives det(B) ≠ 0 ⇒ B invertible ⇒ L_B one-to-one ⇒ columns linearly independent ⇒ β is a basis.
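To see the equivalence concretely, here is a minimal numerical sketch (assuming NumPy; the specific vectors are just an illustrative example, not from the problem): the determinant of B is nonzero exactly when rank(B) = n, i.e. when the columns are linearly independent and hence form a basis.

```python
import numpy as np

# Columns of B are the vectors u_1, u_2, u_3 of beta
# (example vectors chosen for illustration).
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

det = np.linalg.det(B)          # nonzero <=> B invertible
rank = np.linalg.matrix_rank(B) # n <=> columns linearly independent

# det(B) != 0 exactly when rank(B) = 3, i.e. the columns form
# a basis of R^3. Dropping a column or repeating one would make
# the rank fall below 3 (and, for a square matrix, det would be 0).
print(det, rank)  # -3.0 3
```

Swapping the third column for, say, u_1 + u_2 makes the columns dependent: the rank drops to 2 and the determinant becomes 0, matching the "only if" direction.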
Nemesis