Thread: Proof that square matrix has left inverse if and only if it has right inverse

1. Proof that square matrix has left inverse if and only if it has right inverse

Greetings,

This question is something of a repost of this topic:
Not so easy question (at least I hope) about matrix inverse

Most beginner-level texts on linear algebra take this for granted and define an invertible matrix A as a matrix for which there exists another matrix A' such that AA'=I=A'A. Because this definition is given very early on, almost all proofs of properties of invertible matrices rely on it, so they cannot be used to prove the fact itself (creating something of a chicken-and-egg problem). I found some texts that acknowledge this situation and mention that they are not ready to prove the fact at that stage of the exposition, but I could not find at a glance whether they ever prove it at all.
I found these lecture notes:

http://www.lehigh.edu/~gi02/m242/08m242lrinv.pdf

They approach the problem intuitively, extracting analogies from Gauss-Jordan elimination and applying them to the
$P^{T}LU$ factorization of a matrix.
I didn't like the proof much. Although the logic is there, it seems to me that it lacks rigor and elegance.
For example, I could not devise a more precise proof that we can first permute the rows of a matrix once and then add top rows to lower rows to produce an
upper triangular matrix. I think this is needed to prove that every square matrix has a $P^{T}LU$ factorization. I can explain it in words, but I could not devise a "symbolic" proof that it is really so.
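For intuition (a numerical sketch, not a proof), the permute-then-eliminate procedure described above can be written out directly. The following is a minimal partial-pivoting LU factorization in Python with NumPy, written from scratch for illustration; the function name `plu` and the example matrix are my own choices, and the convention used is $PA=LU$, i.e. $A=P^{T}LU$:

```python
import numpy as np

def plu(A):
    # Partial-pivoting LU: returns P, L, U with P @ A = L @ U,
    # hence A = P.T @ L @ U (P is a permutation, so P^-1 = P^T).
    A = A.astype(float).copy()
    n = A.shape[0]
    P = np.eye(n)
    L = np.eye(n)
    U = A
    for k in range(n - 1):
        # permute: swap the row with the largest pivot into position k
        p = k + np.argmax(np.abs(U[k:, k]))
        if p != k:
            U[[k, p], :] = U[[p, k], :]
            P[[k, p], :] = P[[p, k], :]
            L[[k, p], :k] = L[[p, k], :k]
        # eliminate: add multiples of row k to the rows below it
        if U[k, k] != 0:
            for i in range(k + 1, n):
                L[i, k] = U[i, k] / U[k, k]
                U[i, :] -= L[i, k] * U[k, :]
    return P, L, U

# example with a zero in the top-left corner, forcing a row swap
A = np.array([[0., 2., 1.],
              [1., 1., 0.],
              [2., 3., 1.]])
P, L, U = plu(A)
print(np.allclose(A, P.T @ L @ U))  # reconstruction check
```

Note that a single up-front permutation does not suffice in exact arithmetic in general; the swaps are interleaved with the elimination steps and accumulated into one matrix P at the end, which is exactly the subtlety the question is about.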

So the QUESTION:

Is there a more elegant approach, and where can I read about it? If not, where can I read a more rigorous treatment of the above approach? If I'm not mistaken, the $P^{T}LU$ factorization is attributed to Alan Turing in the 1940s, which probably means that the elementary fact in question was easily provable by other means (determinants, something else...) before that.

2. Re: Proof that square matrix has left inverse if and only if it has right inverse

Let A be a matrix with a left inverse A^-1, so A^-1 A = In. We need to show that A^-1 is also a right inverse, i.e. that A A^-1 = In.

Let B = A A^-1; we need to show that B = In. Multiply both sides of B = A A^-1 by A on the right: BA = (A A^-1) A = A (A^-1 A) = A In = A. Then multiply both sides on the right by A^-1: (BA) A^-1 = A A^-1, so B (A A^-1) = A A^-1, i.e. BB = B, that is, B^2 = B.

Since A is invertible, det(A) =/= 0. Then det(A^-1) = 1/det(A) =/= 0, so A^-1 is invertible. Then det(B) = det(A A^-1) = det(A) * det(A^-1) = det(A) * 1/det(A) = 1 =/= 0, so B is invertible. Then B^-1 exists and B has a left inverse. Multiply both sides of B^2 = B by B^-1 on the left: B^-1 (B^2) = B^-1 B, so (B^-1 B) B = B^-1 B, i.e. In B = In, hence B = In. Thus A A^-1 = In, and A has a right inverse.
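As a numerical sanity check of the claim (a demonstration, not a proof; the matrix construction and seed below are arbitrary choices of mine), one can compute a left inverse of a square matrix and verify that it also works on the right:

```python
import numpy as np

rng = np.random.default_rng(0)
# diagonally dominant, hence invertible
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)

# solve A.T @ X = I, so X = (A^-1).T and L = X.T satisfies L @ A = I
L = np.linalg.solve(A.T, np.eye(4)).T

print(np.allclose(L @ A, np.eye(4)))  # L is a left inverse of A
print(np.allclose(A @ L, np.eye(4)))  # ... and also a right inverse
```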

3. Re: Proof that square matrix has left inverse if and only if it has right inverse

Mathguy25, I'll just rewrite your proof with TeX tags.

-------------------------------------------------------
Assume $A^{-1}A=I$ and consider $B=AA^{-1}$. Multiply both sides of $B=AA^{-1}$ by $A$ on the right.
Then $BA=(AA^{-1})A=A(A^{-1}A)=A$. Then multiply both sides on the right by $A^{-1}$.
Then $(BA)A^{-1}=AA^{-1}$, which is the same as $BB=B$.
Now we come to the interesting part of the proof:

Since $A$ is invertible (has a left inverse), $det(A)$ and $det(A^{-1})=1/det(A)$ are both nonzero, so $A^{-1}$ is invertible.

If we knew that $A^{-1}$ were left-invertible too, we could prove that the set of all left-invertible matrices is closed under matrix
multiplication and use the weak left-side group axioms to prove that they do indeed form a group.

Definition:Group Axioms/Left - ProofWiki

That would automatically prove that $AA^{-1}=I=A^{-1}A$.
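For reference, the weak-axioms argument alluded to above can be sketched as follows (a standard exercise; the letters $e$, $b$, $c$ are my own notation): in a semigroup with a left identity $e$ (so $ea=a$ for all $a$) in which every element has a left inverse (for each $a$ there is $b$ with $ba=e$), every left inverse is automatically a right inverse.

```latex
% Let $ba = e$, and let $c$ be a left inverse of $b$, so $cb = e$. Then
\[
  ab \;=\; e(ab) \;=\; (cb)(ab) \;=\; c\bigl((ba)b\bigr) \;=\; c(eb) \;=\; cb \;=\; e,
\]
% so $b$ is also a right inverse of $a$. The left identity is then also a
% right identity:
\[
  ae \;=\; a(ba) \;=\; (ab)a \;=\; ea \;=\; a.
\]
```

Only associativity is used, so the argument applies verbatim to any closed set of square matrices under multiplication.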

Now the question:
How do we prove:
(1) A matrix has a left inverse if and only if its determinant is nonzero. Proving this would be easy if we had some powerful theorem like
what my book calls the Fundamental Theorem of Invertible Matrices. The only problem is that, in the text I
read, this theorem is proven by using the fact $AA^{-1}=I=A^{-1}A$ as given...

So is there a proof of claim (1) which only uses the definition of determinants (Laplace expansion, something like that)?
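For what it's worth, one classical determinant-only route to claim (1) is the adjugate identity (a sketch; this is the standard approach in determinant-first treatments, not necessarily the one your book intends). Laplace expansion along rows and columns yields, for the adjugate $\operatorname{adj}(A)$ (the transposed cofactor matrix):

```latex
\[
  A\,\operatorname{adj}(A) \;=\; \operatorname{adj}(A)\,A \;=\; \det(A)\,I_n .
\]
% Hence, if $\det(A) \neq 0$, the matrix
\[
  B \;=\; \frac{1}{\det(A)}\operatorname{adj}(A)
\]
% satisfies $BA = AB = I_n$, i.e. it is simultaneously a left and a
% right inverse of $A$.
```

The converse direction (left inverse implies nonzero determinant) then follows from multiplicativity of the determinant: $BA=I_n$ gives $\det(B)\det(A)=1$, so $\det(A)\neq 0$.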
