vector space proofs
I'm taking a linear algebra course and I haven't needed to do proofs before this class (I'm an EE major and we take "special" math for Calc and Diff Eq). I could really use some help with the proofs for vector spaces. I have a quiz on Tuesday and one of these proofs will be on it. I know these would be considered simple, but they are giving me a lot of trouble.
1. Let A and B be row equivalent matrices.
(a) Show that the dimension of the column space of A equals the dimension of the column space of B.
(b) Are the column spaces of the two matrices necessarily the same? Justify your answer.
2. Let A be an m x n matrix with rank equal to n. Show that if x ≠ 0, and y=Ax, then y ≠ 0.
3. Let A be an m x n matrix.
(a) If B is a nonsingular m x m matrix, show that BA and A have the same nullspace and hence the same rank.
(b) If C is a nonsingular n x n matrix, show that AC and A have the same rank.
4. Let A ∈ R^(m x n), B ∈ R^(n x r), and C = AB. Show that
(a) The column space of C is a subspace of the column space of A.
(b) The row space of C is a subspace of the row space of B.
(c) rank(C) ≤ min(rank(A), rank(B)).
This is your homework. It was given to you so you could "learn by doing". We will be happy to help and make suggestions but we need to see what you have done in order to help you. You might start by listing the definitions that are relevant. What does it mean for two matrices to be "row equivalent"? What is the "column space" of a matrix?
If I knew how to do these then I wouldn't need to post here. I really have no idea where to start or what to do.
Let me try to get you started. I will try to give you an outline of the proof.
Originally Posted by mdresch31
1.a. Let A' be the matrix you obtain after doing one elementary row operation on A. Show that for the column vectors c1, c2, ..., cn of A, any set of columns that was independent in A is still independent in A', and any dependence relation that held in A still holds in A'. Since row equivalence is a chain of such operations, extend the argument from A to B. Conclude that the same sets of column indices give bases for the column spaces of A and B, hence the two column spaces have the same dimension.
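Here's a quick NumPy sanity check of that claim (not a proof — the matrices are just ones I made up): a single elementary row operation leaves the column-space dimension, i.e. the rank, unchanged.

```python
import numpy as np

# Sample 3x3 matrix whose third column equals the sum of the first two,
# so its rank is 2.
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [0., 1., 1.]])

# One elementary row operation: add 3 * (row 0) to row 2.
# Left-multiplying by this elementary matrix E performs the operation.
E = np.array([[1., 0., 0.],
              [0., 1., 0.],
              [3., 0., 1.]])
A_prime = E @ A

print(np.linalg.matrix_rank(A))        # 2
print(np.linalg.matrix_rank(A_prime))  # 2 -- unchanged by the row operation
```

Any chain of such operations (i.e. any row-equivalent B) keeps the rank the same for the same reason.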
1.b. I am not sure exactly what is being asked, but I believe the answer is 'no'. Think of a counter-example.
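If it helps to check your candidate counter-example numerically, here is one concrete instance (my own choice of matrices): row-equivalent matrices can have column spaces of the same dimension that are nonetheless different subspaces.

```python
import numpy as np

# A and B are row equivalent (B is A with its two rows swapped),
# but col(A) is the x-axis in R^2 while col(B) is the y-axis.
A = np.array([[1.],
              [0.]])
B = np.array([[0.],
              [1.]])

# If the column spaces were equal, stacking the columns side by side
# would still give rank 1.  Instead the two columns span different lines:
print(np.linalg.matrix_rank(np.hstack([A, B])))  # 2
```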
2. y can be viewed as a linear combination of the column vectors of A (check how! — the entries of x provide the scalars). Now rank(A) = n = the number of columns, so all n columns are independent. What is the only way to get 0 as a linear combination of n independent vectors?
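A small numerical illustration of the setup (again just a sample matrix, not a proof): when A has rank n, a nonzero x gives a nontrivial combination of independent columns, so y = Ax is nonzero.

```python
import numpy as np

# 3 x 2 matrix with rank 2, i.e. rank equal to its number of columns.
A = np.array([[1., 0.],
              [1., 1.],
              [0., 2.]])
assert np.linalg.matrix_rank(A) == A.shape[1]

x = np.array([3., -1.])   # an arbitrary nonzero x
y = A @ x                 # y = 3*(first column) - 1*(second column)

print(np.any(y != 0))     # True -- y is nonzero
```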
Will post about the others in a while.
3. One approach is to use the fact that any invertible matrix is row equivalent to I, and derive the results from there.
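For intuition (a numerical check, not the proof): if B is nonsingular then Bv = 0 forces v = 0, so BAx = 0 exactly when Ax = 0. The nullspaces agree, hence so do the ranks. The matrices below are just a sample instance I picked.

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])   # rank 1 (second row is twice the first)
B = np.array([[2., 1.],
              [1., 1.]])       # det = 1, so B is nonsingular

print(np.linalg.matrix_rank(A))      # 1
print(np.linalg.matrix_rank(B @ A))  # 1 -- left-multiplying by B preserves rank
```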
Originally Posted by aman_cc
4. C = AB can be viewed as
- r column vectors, each of which is a linear combination of the column vectors of A (the entries of the corresponding column of B provide the scalars)
- m row vectors, each of which is a linear combination of the row vectors of B
The results in (a)–(c) follow from these observations.
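A quick numerical check of part (c) with matrices of my own choosing — the columns of C lie in col(A) and the rows of C lie in row(B), so rank(C) cannot exceed either rank:

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])       # 3 x 2, rank 2
B = np.array([[1., 2., 3.],
              [2., 4., 6.]])   # 2 x 3, rank 1
C = A @ B

rC = np.linalg.matrix_rank(C)
print(rC)                                                             # 1
print(rC <= min(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B)))  # True
```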