How can we show that a matrix has both a right and a left inverse if and only if it is square and its rank equals its order? (We don't know the concept of determinant yet.)
suppose A is an nxn matrix, with rank(A) = n. then its columns are linearly independent, and since A(ej) is just the j-th column of A, if {e1,e2,...,en} is the standard basis for the domain of A (F^n, probably R^n in your case),
then {A(e1),A(e2),...,A(en)} is also a basis for F^n. let P be the matrix which sends A(ej) --> ej (a change-of-basis matrix). then, writing any v as v = a1A(e1) + a2A(e2) +...+ anA(en) (possible since these vectors form a basis):
AP(v) = AP(a1A(e1) + a2A(e2) +...+ anA(en)) = A(P(a1A(e1) + a2A(e2) +...+ anA(en)))
= A(a1P(A(e1)) + a2P(A(e2)) +...+ anP(A(en))) = A(a1e1 + a2e2 +...+anen)
= a1A(e1) + a2A(e2) +...+ anA(en) = v = I(v), for all v, so P is a right-inverse for A. also:
PA(v) = PA(b1e1 + b2e2 +...+ bnen) = P(A(b1e1 + b2e2 + ...+ bnen))
= P(b1A(e1) + b2A(e2) +...+ bnA(en)) = b1P(A(e1)) + b2P(A(e2)) +...+ bnP(A(en))
= b1e1 + b2e2 +...+ bnen = v = I(v), for all v, so P is a left-inverse for A.
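as a concrete sanity check (my own toy example, not part of the proof): for a small full-rank matrix A, the matrix P sending each column A(ej) back to ej can be computed numerically from its defining property PA = I, and it comes out as a two-sided inverse:

```python
import numpy as np

# a hypothetical full-rank 2x2 matrix A (rank 2, so its columns form a basis)
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# P is defined by P @ A = I. equivalently A.T @ P.T = I,
# which we solve with np.linalg.solve (no determinants needed on our end):
I = np.eye(2)
P = np.linalg.solve(A.T, I).T

# P is simultaneously a right and a left inverse for A:
print(np.allclose(A @ P, I))  # True
print(np.allclose(P @ A, I))  # True
```

here P works out to [[1, -1], [0, 1]], which indeed maps the columns (1,0) and (1,1) of A back to e1 and e2.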
*******
now, suppose that A has a two-sided inverse B, so AB = BA = I.
if A is mxn, then for AB to be defined B must be nxm, so AB = I is the mxm identity and BA = I is the nxn identity. since BA = I, A is injective (if Av = Aw, then v = BAv = BAw = w), and an injective linear map F^n --> F^m forces n <= m. since AB = I, A is surjective (v = A(Bv)), which forces m <= n. thus m = n, and A is square.
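to see why both sides are needed, here is a small NumPy sketch (my own example) of a non-square matrix with a right inverse but no left inverse:

```python
import numpy as np

# a hypothetical 2x3 matrix A and a 3x2 candidate inverse B
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

print(np.allclose(A @ B, np.eye(2)))  # True: B is a right inverse of A
print(np.allclose(B @ A, np.eye(3)))  # False: B is NOT a left inverse of A
```

B @ A comes out as diag(1, 1, 0), which can never equal the 3x3 identity; only a square matrix can have an inverse on both sides.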
note that v = I(v) = A(B(v)), so every v lies in col(A); thus col(A) is all of F^n, and dim(col(A)) = rank(A) = n.
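the rank conclusion can also be checked numerically; with a hypothetical invertible matrix (np.linalg.matrix_rank is used only as a check, since the argument itself avoids determinants):

```python
import numpy as np

# a hypothetical invertible 2x2 matrix: every v = A(B(v)) lies in col(A),
# so col(A) is the whole space and rank(A) = n
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)  # its two-sided inverse
n = A.shape[0]

print(np.allclose(A @ B, np.eye(n)))      # True
print(np.allclose(B @ A, np.eye(n)))      # True
print(np.linalg.matrix_rank(A) == n)      # True: rank equals the order
```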