
Right and Left Square matrix inverses

  1. #1
    Newbie
    Joined
    Nov 2010
    Posts
    6

    Right and Left Square matrix inverses

    Hi guys,

    I would like to intuitively understand why the following is true:
    Let A be a square, invertible matrix, let C be a left inverse of A (so CA=I), and let B be a right inverse of A (so AB=I).
    Then it follows that B=C.

    I am aware of the simple proof:
    C = CI = C(AB) = (CA)B = IB = B
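    As a quick numerical sanity check of the claim (a minimal sketch, assuming Python with NumPy), one can compute a right inverse and a left inverse by independent routes and confirm they coincide:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 5
        A = rng.standard_normal((n, n))        # a random matrix, almost surely invertible

        B = np.linalg.solve(A, np.eye(n))      # right inverse: solves A @ B = I
        C = np.linalg.solve(A.T, np.eye(n)).T  # left inverse: C @ A = I, via A^T @ C^T = I

        print(np.allclose(A @ B, np.eye(n)))   # True
        print(np.allclose(C @ A, np.eye(n)))   # True
        print(np.allclose(B, C))               # True: the two inverses coincide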

    Yet this gives away no intuition about the logic behind the claim.
    The left inverse acts on A's rows, the right one acts on A's columns.
    This seems almost like magic.

    How would you explain this in terms a child could understand?

    Thanks.

  2. #2
    MHF Contributor

    Joined
    Apr 2005
    Posts
    15,792
    Thanks
    1532
    I cannot imagine anything being simpler than that! The "intuition" about the logic is precisely the meaning of "inverse": it has nothing to do with "columns" or "rows".

  3. #3
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    I believe some authors define the inverse such that it commutes with the original matrix. In that case, it is superfluous to talk about right and left inverses, because they are, by definition, the same thing.

  4. #4
    Newbie
    Joined
    Nov 2010
    Posts
    6
    I'm going by the natural definition of a left or right inverse. For square matrices they turn out to be the same, and only then can you say "inverse" without specifying left or right.

    Sure, we can look at this in a high-level manner, as a general algebraic structure satisfying associativity and the existence of a unit element. But then we'd be ignoring the beautiful properties of matrices.
    AB applies a series of operations to the columns of A, prescribed by B.
    If AB=I, one cannot immediately see why BA=I as well: BA applies a series of operations to the rows of A, prescribed by B, which is, so to speak, a completely different operation.
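    For a concrete instance of the distinction (a small worked example): take

    A=\begin{pmatrix}a&b\\c&d\end{pmatrix},\qquad B=\begin{pmatrix}0&1\\1&0\end{pmatrix}.

    Then AB=\begin{pmatrix}b&a\\d&c\end{pmatrix} swaps the columns of A, while BA=\begin{pmatrix}c&d\\a&b\end{pmatrix} swaps the rows of A.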
    The simple proof above simply does not answer my question, at least not directly.

  5. #5
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    Tell you what: why don't you give us a list of assumptions (axioms) that you both believe and understand, and that are relevant to the question at hand. List also any other theorems you buy into. Then state the theorem you don't understand, and perhaps we can produce a constructive proof of the theorem that might enable you to understand it.

  6. #6
    Newbie
    Joined
    Nov 2010
    Posts
    6
    Given only the definition of matrix multiplication, why is the following true for square matrices:
    If AB=I, then BA=I?

  7. #7
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    Hmm. That could be challenging, although it seems intuitive that you should be able to prove the result from that one assumption. I must admit that I don't know off-hand how to do it, but maybe some ramblings will stimulate some thinking for you.

    So the usual definition of matrix multiplication for square matrices is as follows. Let A,B be square n\times n matrices. The product matrix C=AB is the matrix with entries

    \displaystyle C_{ij}=\sum_{k=1}^{n}A_{ik}B_{kj}.

    Let us assume that AB=I. We wish to show that BA=I.

    Now, define the Kronecker delta symbol \delta_{ij} as follows:

    \delta_{ij}=\begin{cases}1\quad i=j\\ 0\quad i\not=j\end{cases}.

    It is standard to show that

    I_{ij}=\delta_{ij}. Thus, by assumption, we have that

    \displaystyle \sum_{k=1}^{n}A_{ik}B_{kj}=\delta_{ij}.

    Now, we know that B^{T}A^{T}=(AB)^{T}=I^{T}=I. This you can prove using the definition of matrix multiplication. How would that look in summation notation?
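    In summation notation, that step might look like this (a sketch, spelling out the hint):

    \displaystyle \left(B^{T}A^{T}\right)_{ij}=\sum_{k=1}^{n}\left(B^{T}\right)_{ik}\left(A^{T}\right)_{kj}=\sum_{k=1}^{n}A_{jk}B_{ki}=\left(AB\right)_{ji}=\delta_{ji}=\delta_{ij}.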

  8. #8
    MHF Contributor Drexel28
    Joined
    Nov 2009
    From
    Berkeley, California
    Posts
    4,563
    Thanks
    21
    Quote Originally Posted by leolol
    Given only the definition of matrix multiplication, why is the following true for square matrices:
    If AB=I, then BA=I?
    Try lifting A to a linear transformation.

  9. #9
    MHF Contributor Drexel28
    Joined
    Nov 2009
    From
    Berkeley, California
    Posts
    4,563
    Thanks
    21
    Quote Originally Posted by Drexel28
    Try lifting A to a linear transformation.
    Or consider that if AB=I, then 1=\det\left(AB\right)=\det\left(A\right)\det\left(B\right), and so \det\left(A\right),\det\left(B\right)\ne 0, and so A,B are invertible.

    Thus, we note that AB=I\implies B=A^{-1}, and so B^{-1}=\left(A^{-1}\right)^{-1}=A, and so I=BA.
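    Numerically, the chain is easy to check (again a sketch, assuming NumPy):

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.standard_normal((4, 4))
        B = np.linalg.inv(A)  # constructed so that A @ B = I

        print(np.isclose(np.linalg.det(A) * np.linalg.det(B), 1.0))  # True: det(A)det(B) = det(AB) = 1
        print(np.allclose(B @ A, np.eye(4)))                         # True: and indeed BA = I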

  10. #10
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    I don't think we are allowed to use determinants.

  11. #11
    MHF Contributor Drexel28
    Joined
    Nov 2009
    From
    Berkeley, California
    Posts
    4,563
    Thanks
    21
    Quote Originally Posted by Ackbeet
    I don't think we are allowed to use determinants.
    Then consider the fact that we can view the matrices A,B as endomorphisms, for simplicity's sake from \mathbb{R}^n to itself. The existence of a left inverse for B implies that B is injective, thus B\left(\mathbb{R}^n\right) is n-dimensional, and so from an elementary theorem it follows that B\left(\mathbb{R}^n\right)=\mathbb{R}^n; hence B is surjective, thus bijective, and so B is invertible. Similarly, since A possesses a right inverse, we know that A is surjective, and it is easy to show that this is impossible if A is not injective; thus A is also invertible.
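    In symbols, the injectivity step might look like this (a sketch using rank-nullity): since AB=I, Bx=By implies x=A(Bx)=A(By)=y, so \ker B=\{0\}; hence \dim B\left(\mathbb{R}^n\right)=n-\dim\ker B=n, and so B\left(\mathbb{R}^n\right)=\mathbb{R}^n.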

    The rest of my proof stands.

  12. #12
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    I'm not sure I followed all that, so you tell me: does this proof satisfy the conditions of post #6?

  13. #13
    MHF Contributor Drexel28
    Joined
    Nov 2009
    From
    Berkeley, California
    Posts
    4,563
    Thanks
    21
    Quote Originally Posted by Ackbeet
    I'm not sure I followed all that, so you tell me: does this proof satisfy the conditions of post #6?
    No, it does not. But this is really a statement about endomorphisms of finite-dimensional spaces, whether the OP would like to admit it or not. Even if there were a purely matrix-analytic proof, it would obscure the reason the result is true.
