Prove it by induction on r. Start with r = 0. Then any s rows form a matrix whose rank is 0. Since s ≤ m, we have r + s - m = s - m ≤ 0, so it is trivially true that the rank is at least r + s - m.
Next, make the induction hypothesis and prove the induction step.
We prove this by induction on r. First we show this holds for r = 0. If r = 0, then any s rows of A form a matrix whose rank is 0. Since s ≤ m, we have 0 ≥ s - m = r + s - m. So this holds for r = 0.
Now we assume the claim holds for matrices of rank r, and show that it holds for matrices of rank r + 1.
The induction hypothesis says that if rank(A) = r, then any s rows of A form a matrix whose rank is at least r + s - m.
So we need to show that if rank(A) = r + 1, then any s rows of A form a matrix whose rank is at least (r + 1) + s - m.
Is this on the right track? I am not sure where to go from here.
Actually, induction might not be the best way to prove this. Perhaps a direct proof would be more efficient. If you have m rows and the rank is r, that tells you that at least r rows are linearly independent (and in particular nonzero). Suppose the rest are all zero rows. Then there are m - r zero rows. So, if s ≤ m - r and you just happen to choose all zero rows, then the rank of the matrix you chose would be 0 ≥ r + s - m, which is true. If s > m - r, then by the pigeonhole principle, you must have chosen at least s - (m - r) of the independent rows, so the rank is a minimum of s - (m - r) = r + s - m.
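As a quick numerical sanity check of the bound (an illustration only, not a proof; this assumes NumPy is available):

```python
# Sanity check (not a proof): for an m x n matrix of rank r, any s of its
# rows should form a matrix of rank at least r + s - m.
import itertools
import numpy as np

rng = np.random.default_rng(0)

for _ in range(100):
    m, n = 5, 4
    # Build a random low-rank matrix by multiplying thin integer factors,
    # so rank(A) <= k; then measure the actual rank r.
    k = int(rng.integers(1, 4))
    A = rng.integers(-2, 3, (m, k)) @ rng.integers(-2, 3, (k, n))
    r = np.linalg.matrix_rank(A)
    # Check the bound for every choice of s rows.
    for s in range(1, m + 1):
        for rows in itertools.combinations(range(m), s):
            assert np.linalg.matrix_rank(A[list(rows)]) >= r + s - m
```

Running this over many random low-rank matrices never finds a violating choice of rows, as the inequality predicts.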
You can use the fact that if a row is added to a matrix, its rank cannot increase by more than 1. So, if the rank of the chosen s rows were less than r + s - m = r - (m - s), then adding back the remaining m - s rows could not restore the rank of the matrix to r, a contradiction.
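The "adding one row raises the rank by at most 1" fact can be seen numerically (NumPy assumed; the matrix below is just an arbitrary illustration):

```python
# Appending rows one at a time: the rank never jumps by more than 1.
import numpy as np

A = np.array([[1, 0, 0],
              [0, 1, 0],
              [1, 1, 0],   # dependent: equals row 1 + row 2
              [0, 0, 1]])

# Rank of the first i rows, for i = 1, ..., 4.
ranks = [int(np.linalg.matrix_rank(A[:i])) for i in range(1, len(A) + 1)]
print(ranks)  # -> [1, 2, 2, 3]

# Each appended row changes the rank by 0 or 1, never more.
assert all(0 <= b - a <= 1 for a, b in zip(ranks, ranks[1:]))
```

Applying this m - s times (once per removed row) gives exactly the inequality above.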