Greetings,

This question is something of a repost of this topic.

A not-so-easy question (at least I hope) about the matrix inverse.

Most beginner-level texts on linear algebra take this for granted and define an invertible matrix A as one for which there exists another matrix A' such that AA' = I = A'A. Because this definition is given very early in the subject, almost all proofs of properties of invertible matrices rely on it, so they cannot be used to prove the fact itself (creating something of a chicken-and-egg problem). I found some texts that note this situation and mention that they are not ready to prove the fact at that stage of the exposition, but I could not find, at a glance, whether they prove it at all.
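(For context: if the fact in question is that, for a square matrix, a one-sided inverse is automatically two-sided, then the standard short argument rests on finite-dimensionality. A sketch, as I understand it, in LaTeX:)

```latex
\begin{proof}[Sketch]
Suppose $A, B$ are $n \times n$ and $AB = I$. Then $B$ is injective:
$Bx = 0 \implies x = (AB)x = A(Bx) = 0$. An injective linear map on a
finite-dimensional space is also surjective, so there is a matrix $C$
with $BC = I$. Then $A = A(BC) = (AB)C = C$, hence $BA = BC = I$.
\end{proof}
```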

I found these lecture notes:

http://www.lehigh.edu/~gi02/m242/08m242lrinv.pdf

They approach the problem intuitively, by drawing analogies from Gauss-Jordan elimination and applying them to the factorization of a matrix.

I didn't find the proof fully satisfying. Although the logic is there, it seems to me that it lacks rigor and elegance.

For example, I could not devise a more precise proof that we can first permute the rows of a matrix once, and then add multiples of upper rows to lower rows, to produce an upper triangular matrix. I think this is needed to prove that any square matrix has such a factorization. I can explain it in words, but I could not devise a "symbolic" proof that it is really so.
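To convince myself numerically (this is just my own illustration, not taken from the notes), here is a small sketch of Gaussian elimination with partial pivoting: permute rows, then subtract multiples of upper rows from lower rows, and check that the resulting factors satisfy PA = LU.

```python
# Sketch: Gaussian elimination with partial pivoting, producing P, L, U
# with P @ A == L @ U. The function name `plu` is my own choice.
import numpy as np

def plu(A):
    """Return (P, L, U) such that P @ A == L @ U."""
    U = A.astype(float).copy()
    n = U.shape[0]
    P = np.eye(n)
    L = np.eye(n)
    for k in range(n - 1):
        # Permute rows so the pivot has the largest magnitude in column k.
        p = k + np.argmax(np.abs(U[k:, k]))
        if p != k:
            U[[k, p]] = U[[p, k]]
            P[[k, p]] = P[[p, k]]
            L[[k, p], :k] = L[[p, k], :k]
        # Eliminate entries below the pivot using multiples of row k.
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    return P, L, U

A = np.array([[0., 2., 1.],
              [1., 1., 1.],
              [2., 1., 0.]])
P, L, U = plu(A)
assert np.allclose(P @ A, L @ U)
```

Note that the leading zero in A forces at least one row permutation, which is exactly the situation the "permute once, then eliminate" argument has to handle.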

So the QUESTION:

Is there a more elegant approach, and where can I read about it? If not, where can I read a more rigorous treatment of the approach above? If I'm not mistaken, the factorization is attributed to Alan Turing (around 1948), which probably means that the elementary fact in question was already provable by other means (determinants, something else...) before then.
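(For what it's worth, my understanding of the determinant route alluded to above is the following; the adjugate identity comes directly from the Laplace expansion, so it does not presuppose invertibility:)

```latex
\begin{proof}[Sketch via determinants]
If $AB = I$, then $\det(A)\det(B) = \det(I) = 1$, so $\det(A) \neq 0$.
The Laplace expansion yields $A\,\operatorname{adj}(A) = \operatorname{adj}(A)\,A = \det(A)\,I$
for every square $A$, so $A^{-1} = \det(A)^{-1}\operatorname{adj}(A)$
is a genuine two-sided inverse. Then
$B = (A^{-1}A)B = A^{-1}(AB) = A^{-1}$, hence $BA = I$ as well.
\end{proof}
```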