# Math Help - Proofs using column vectors and invertible matrix

1. ## Proofs using column vectors and invertible matrix

Here it is:

Let A be an n x n matrix. Denote its columns by v1, v2, ..., vn. Keep in mind that these are column vectors in R^n.

a. Prove that A is invertible if and only if {v1, v2, ... vn} is linearly independent. We are supposed to use the Invertible Matrix Theorem here.

This is what I have so far:
Let A be invertible. We want to show {v1, v2, ..., vn} is linearly independent. {v1, v2, ..., vn} is the standard basis for R^n. Let c be in the reals. Then set c1v1 + c2v2 + ... + cnvn = the zero vector. Then
[c1]     [0]
[c2]  =  [0]   (the zero vector)
[...]    [...]
[cn]     [0]
Thus, {v1, v2, ... vn} is linearly independent.

I'm not sure if this is right or how to complete the second half of the proof. Any help would be appreciated!
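As a quick numerical sanity check of the statement being proved (a sketch only, using NumPy on an arbitrary 2 x 2 example; this is an illustration, not the requested proof):

```python
import numpy as np

# Columns v1 = (1, 3), v2 = (2, 4): an arbitrary invertible example.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = A.shape[0]

# Columns linearly independent  <=>  rank(A) = n,
# and A invertible              <=>  det(A) != 0.
print(np.linalg.matrix_rank(A) == n)   # True: columns independent
print(abs(np.linalg.det(A)) > 1e-12)   # True: A invertible

# Contrast: v2 = 2*v1 makes the columns dependent and the matrix singular.
B = np.array([[1.0, 2.0],
              [3.0, 6.0]])
print(np.linalg.matrix_rank(B) == n)   # False
print(abs(np.linalg.det(B)) > 1e-12)   # False
```

Both properties flip together in the two examples, which is exactly what the if-and-only-if predicts.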

2. ## Re: Proofs using column vectors and invertible matrix

Originally Posted by widenerl194
Here it is:

Let A be an n x n matrix. Denote its columns by v1, v2, ..., vn. Keep in mind that these are column vectors in R^n.

a. Prove that A is invertible if and only if {v1, v2, ... vn} is linearly independent. We are supposed to use the Invertible Matrix Theorem here.
Okay, what is the "Invertible Matrix Theorem"?

This is what I have so far:
Let A be invertible. We want to show {v1, v2, ..., vn} is linearly independent. {v1, v2, ..., vn} is the standard basis for R^n.
This is confusing, you said above that v1, v2, ...., vn were the columns of A. Now you are saying they are the standard basis for Rn.
Which do you want?

Let c be in the reals. Then set c1v1 + c2v2 + ... + cnvn = the zero vector. Then
[c1]     [0]
[c2]  =  [0]   (the zero vector)
[...]    [...]
[cn]     [0]
Thus, {v1, v2, ... vn} is linearly independent.

I'm not sure if this is right or how to complete the second half of the proof. Any help would be appreciated!
Then you say "Let c be in the reals" but talk about c1, c2, ..., cn. Did you mean "ci" rather than just "c"?

And what do you mean by "[c1] = the zero vector = [0]"? What does the "[ ]" indicate?

3. ## Re: Proofs using column vectors and invertible matrix

The invertible matrix theorem says:

Let A be an n x n matrix, I the n x n identity matrix, and θ the zero column vector in R^n. The following are equivalent:
1. A is invertible
2. The reduced row echelon form of A is I
3. For any b in R^n, the equation Ax=b has exactly one solution
4. The equation Ax=θ has only x=θ as a solution

The v1, v2, ..., vn above should be taken as the columns of A.
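Given equivalence 4, the key step (one standard argument; a sketch, not necessarily the intended one) is that a matrix-vector product is a linear combination of the columns:

```latex
% Key identity: Ax is a linear combination of the columns of A.
A\mathbf{x} \;=\; x_1\mathbf{v}_1 + x_2\mathbf{v}_2 + \cdots + x_n\mathbf{v}_n,
\qquad \mathbf{x} = (x_1, x_2, \dots, x_n)^{T}.
% Hence:  Ax = \theta has only the solution x = \theta
%   \iff  the only scalars with x_1 v_1 + \cdots + x_n v_n = \theta
%         are x_1 = \cdots = x_n = 0
%   \iff  \{v_1, \dots, v_n\} is linearly independent.
% Combining this with equivalence (1 \iff 4) of the theorem gives
% both directions of the "if and only if" at once.
```

So there is no need to treat the two directions separately: equivalence 4 translates directly into the definition of linear independence.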