# linear algebra

• Oct 3rd 2006, 05:40 AM
mooshazz
linear algebra
How do I know that if

A*A^-1 = I

then

A^-1*A = I

when A is a matrix?
• Oct 3rd 2006, 08:02 AM
ThePerfectHacker
Quote:

Originally Posted by mooshazz
How do I know that if

A*A^-1 = I

then

A^-1*A = I

when A is a matrix?

You do not.
You need to show that.
Matrix multiplication is not commutative.

Though it is an interesting question: if something is a left inverse, is it also a right inverse for square matrices? I do not think so. (In a general monoid the answer is no, but these are matrices; still, I think the answer is no.)
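As an aside, the question above can be probed numerically. A minimal NumPy sketch (not part of the original thread): compute only a left inverse of a random square matrix, by solving BA = I as a linear system, and then check whether it also acts as a right inverse.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))  # a random square matrix, almost surely invertible

# Compute only a LEFT inverse: solve B A = I for B.
# Transposing gives A^T B^T = I, a standard linear system.
B = np.linalg.solve(A.T, np.eye(n)).T

assert np.allclose(B @ A, np.eye(n))  # a left inverse by construction
assert np.allclose(A @ B, np.eye(n))  # ...and it turns out to be a right inverse too
```

This is only evidence for one random matrix, not a proof, but it suggests that for square matrices a left inverse is in fact a right inverse.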
• Oct 3rd 2006, 08:11 AM
CaptainBlack
Quote:

Originally Posted by mooshazz
How do I know that if

A*A^-1 = I

then

A^-1*A = I

when A is a matrix?

Suppose BA = I = AC,

then

B = BI = B(AC) = (BA)C = C.

So if both a left and a right inverse exist, then they are identical.

RonL
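The cancellation B = BI = B(AC) = (BA)C = C above can be checked numerically; a minimal NumPy sketch, computing a left inverse and a right inverse of the same random matrix independently and confirming they coincide:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))  # almost surely invertible

# A left inverse B (solve B A = I, via the transposed system A^T B^T = I)
# and a right inverse C (solve A C = I), computed independently.
B = np.linalg.solve(A.T, np.eye(n)).T
C = np.linalg.solve(A, np.eye(n))

assert np.allclose(B @ A, np.eye(n))  # B is a left inverse
assert np.allclose(A @ C, np.eye(n))  # C is a right inverse
assert np.allclose(B, C)              # B = B(AC) = (BA)C = C
```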
• Oct 3rd 2006, 08:15 AM
ThePerfectHacker
Quote:

Originally Posted by CaptainBlack
So if both a left and right inverse exist then they are identical.

Umm... The user is asking: if a left inverse exists, prove a right inverse exists. Then show uniqueness. Then show equality.
• Oct 3rd 2006, 10:01 AM
CaptainBlack
Quote:

Originally Posted by ThePerfectHacker
Umm... The user is asking: if a left inverse exists, prove a right inverse exists. Then show uniqueness. Then show equality.

No, he is asking to prove that if a left (or right, I don't recall which is which) inverse exists, then it is also a right inverse.

I have left it to the reader to show that if one of the inverses exists then
so does the other.

However, here is a nice proof.

That A has a left inverse means that A, considered as a function on R^n (or C^n) whose action is defined by

A(X) = AX,

is one-one, and hence, by rank-nullity, onto from R^n to itself. Then the left inverse B is a function such that:

B(A(X))=BAX=X

But also B takes X to Y such that A(Y) = X, so

A(B(X)) = ABX = X

Hence a left inverse is a right inverse (or have I made some
assumption that is not valid? :confused: )

RonL
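The one-one/onto argument above can also be illustrated numerically; a short NumPy sketch of each step of the reasoning, for one random matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))               # almost surely invertible
B = np.linalg.solve(A.T, np.eye(n)).T          # a left inverse: B A = I

# BA = I forces X -> AX to be one-one: if AX = 0 then X = (BA)X = B(AX) = 0.
# A one-one linear map of R^n to itself has full rank, hence is onto.
assert np.linalg.matrix_rank(A) == n

# So every X has a preimage; in fact Y = BX works, i.e. A(B(X)) = X,
# which is exactly the statement that B is also a right inverse.
x = rng.standard_normal(n)
y = B @ x
assert np.allclose(A @ y, x)
```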
• Oct 3rd 2006, 10:08 AM
ThePerfectHacker
Quote:

Originally Posted by CaptainBlack
assumption that is not valid? :confused: )

I was commenting on your first post. There you used the fact that both the left and right inverses exist. But he is asking whether, if one exists, then so does the other (which is what you have shown just now, though I cannot say I understand your proof). One thing I do understand is that you use the fact that A is a matrix and not just an element of some general binary algebraic structure; thus that rule need not hold in general.
• Oct 3rd 2006, 03:42 PM
AfterShock
"Matrix Multiplication is not commutative."

Indeed, although for inverses of matrices, it works out. Matrix multiplication is, however, associative.

(AB)^(-1)*(AB) = B^(-1) * A^(-1) * A * B = B^(-1) * I * B = B^(-1) * B = I

where I is the identity matrix.
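The expansion above relies on associativity letting the inner factors cancel first; a quick NumPy check of both that cancellation and the resulting identity (AB)^-1 = B^-1 * A^-1:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
A = rng.standard_normal((n, n))  # almost surely invertible
B = rng.standard_normal((n, n))  # almost surely invertible
inv = np.linalg.inv

# Associativity lets the inner factors cancel first:
# (B^-1 A^-1)(A B) = B^-1 (A^-1 A) B = B^-1 B = I
assert np.allclose(inv(B) @ inv(A) @ A @ B, np.eye(n))

# which identifies B^-1 A^-1 as the inverse of AB:
assert np.allclose(inv(A @ B), inv(B) @ inv(A))
```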
• Oct 3rd 2006, 04:33 PM
ThePerfectHacker
Quote:

Originally Posted by AfterShock
"Matrix Multiplication is not commutative."

Indeed, although for inverses of matrices, it works out. Matrix multiplication is, however, associative.

(AB)^(-1)*(AB) = B^(-1) * A^(-1) * A * B = B^(-1) * I * B = B^(-1) * B = I

where I is the identity matrix.

Thank you. But that is not what we were discussing.
• Oct 3rd 2006, 05:03 PM
AfterShock
A*A^-1=I
then
A^-1*A=I

It's showing why this holds through an example.

So, I think it is what we are discussing.
• Oct 3rd 2006, 06:59 PM
ThePerfectHacker
Quote:

Originally Posted by AfterShock
A*A^-1=I
then
A^-1*A=I

It's showing why this holds through an example.

So, I think it is what we are discussing.

You did not prove the biconditional,
A*A^-1 = I if and only if A^-1*A = I.
(CaptainBlack did. I just wish he were as careful in his proofs as I am, so I could understand them. He uses the style of Laplace; I use the style of Lagrange.)

You have shown that

(AB)^{-1}*(AB) = I

which does not need the long expansion: simply set C = AB. Then

C^-1*C = I.

Proof complete.

Unless you were trying to show something else which I missed.