Re: Proof involving matrices

You should know that you can only multiply matrices when the number of columns in the first is equal to the number of rows in the second, and the resulting product matrix will have the "remaining" dimensions as its order. So that means $\displaystyle \displaystyle \begin{align*} \mathbf{C}_{n \times m} \cdot \mathbf{A}_{m \times n} = \left( \mathbf{CA} \right)_{n \times n} \end{align*}$.
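As a concrete check of the dimension rule (a quick sketch using NumPy; the sizes here are arbitrary):

```python
import numpy as np

m, n = 2, 3
C = np.ones((n, m))  # 3 x 2
A = np.ones((m, n))  # 2 x 3

# The product C @ A is defined because C has m columns and A has
# m rows; it keeps the "outer" dimensions, giving an n x n matrix.
print((C @ A).shape)  # (3, 3)
```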

Anyway, the idea will be to show that two matrices are equal to the same thing, and are therefore equal to each other. Please note that where I'm using the inverse matrix notation, it means either a right-inverse or a left-inverse, i.e. some matrix which will multiply to give the identity without the original matrix necessarily being square. Working on the first product we have

$\displaystyle \displaystyle \begin{align*} \mathbf{C}_{n \times m} \cdot \mathbf{A}_{m \times n} &= \mathbf{I}_n \\ \mathbf{C}^{-1}_{m \times n} \cdot \mathbf{C}_{n \times m} \cdot \mathbf{A}_{m \times n} &= \mathbf{C}^{-1}_{m \times n} \cdot \mathbf{I}_n \\ \mathbf{I}_m \cdot \mathbf{A}_{m \times n} &= \mathbf{C}^{-1}_{m \times n} \\ \mathbf{A}_{m \times n} &= \mathbf{C}^{-1}_{m \times n} \end{align*}$

Working on the second product we have

$\displaystyle \displaystyle \begin{align*} \mathbf{A}_{m \times n} \cdot \mathbf{D}_{n \times m} &= \mathbf{I}_m \\ \mathbf{A}_{m \times n} \cdot \mathbf{D}_{n \times m} \cdot \mathbf{D}^{-1}_{m \times n} &= \mathbf{I}_m \cdot \mathbf{D}^{-1}_{m \times n} \\ \mathbf{A}_{m \times n} \cdot \mathbf{I}_n &= \mathbf{D}^{-1}_{m \times n} \\ \mathbf{A}_{m \times n} &= \mathbf{D}^{-1}_{m \times n} \end{align*}$

So it should be clear that $\displaystyle \displaystyle \begin{align*} \mathbf{C}^{-1}_{m \times n} = \mathbf{D}^{-1}_{m \times n} \end{align*}$, and so we get

$\displaystyle \displaystyle \begin{align*} \mathbf{C}_{n \times m} \cdot \mathbf{C}^{-1}_{m \times n} &= \mathbf{C}_{n \times m} \cdot \mathbf{D}^{-1}_{m \times n} \\ \mathbf{I}_n &= \mathbf{C}_{n \times m} \cdot \mathbf{D}^{-1}_{m \times n} \\ \mathbf{I}_n \cdot \mathbf{D}_{ n \times m} &= \mathbf{C}_{n \times m} \cdot \mathbf{D}^{-1}_{m \times n} \cdot \mathbf{D}_{n \times m} \\ \mathbf{D}_{n \times m} &= \mathbf{C}_{n \times m} \cdot \mathbf{I}_m \\ \mathbf{D}_{n \times m} &= \mathbf{C}_{n \times m} \end{align*}$
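When the dimensions do match up (m = n), the conclusion C = D is easy to check numerically. Below is a small sketch (my own illustration, not part of the thread; the invertible matrix A is an arbitrary example), assuming NumPy is available:

```python
import numpy as np

# An arbitrary invertible 2 x 2 example (so m = n = 2).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

C = np.linalg.inv(A)  # satisfies C @ A = I
D = np.linalg.inv(A)  # satisfies A @ D = I

assert np.allclose(C @ A, np.eye(2))  # CA = I_n
assert np.allclose(A @ D, np.eye(2))  # AD = I_m
assert np.allclose(C, D)              # C = D, as derived above
```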

Re: Proof involving matrices

I would look at this more "abstractly". A is a linear transformation that maps $\displaystyle R^m$ to $\displaystyle R^n$. C is a linear transformation that maps $\displaystyle R^n$ to $\displaystyle R^m$, and D is a linear transformation that maps $\displaystyle R^m$ to $\displaystyle R^n$.

Now, suppose n > m. Then A maps $\displaystyle R^m$ into an (at most) m-dimensional **subspace** of $\displaystyle R^n$. But that means that there exists a vector, v say, in $\displaystyle R^n$ such that $\displaystyle Au\ne v$ for any u in $\displaystyle R^m$. Look at ADv for that v. Do you see that ADv **cannot** be equal to v?
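The obstruction can be seen concretely with a rank computation. Here is a NumPy sketch (my own illustration, not part of the thread) written in the column-vector convention, so the random 3 × 2 matrix below plays the role of the map from $\displaystyle R^2$ into $\displaystyle R^3$ (n = 3 > m = 2), and D is an arbitrary candidate:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))  # sends R^2 into a plane inside R^3
D = rng.standard_normal((2, 3))  # any candidate with AD meant to be I_3

AD = A @ D
# Every column of A @ D lies in the image of A, which has dimension
# at most 2, so AD has rank at most 2 and cannot equal the rank-3
# identity: some v in R^3 must satisfy ADv != v.
assert np.linalg.matrix_rank(AD) <= 2
assert not np.allclose(AD, np.eye(3))
```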

Re: Proof involving matrices

Quote:

Originally Posted by

**Nora314** **Suppose A is an $\displaystyle m \times n$ matrix and there exist $\displaystyle n \times m$ matrices C and D such that $\displaystyle CA = I_n$ and $\displaystyle AD = I_m$. Prove that $\displaystyle m = n$ and $\displaystyle C = D$.**

This is a comment on both replies.

Note that $\displaystyle A$ is an $\displaystyle m\times n$ matrix. **Each** of $\displaystyle C~\&~D$ is an $\displaystyle n\times m$ matrix.

Usually, as linear transformations, $\displaystyle A:R^n\to R^m,~C:R^m\to R^n,~\&~D:R^m\to R^n$.

Unless we know that $\displaystyle n=m$ we don't think about the inverse of a matrix. There is a notion of *right-inverse* and *left-inverse*, but I don't think that is called for here.

You are given that $\displaystyle CA=I_n~\&~AD=I_m$, thus

$\displaystyle C=CI_m=C(AD)=(CA)D=I_nD=D$.

How do we get $\displaystyle m=n~?$
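A numerical hint (my own sketch, not Plato's): the rank of a product is at most the rank of either factor, so an n × m times m × n product has rank at most min(m, n), and $\displaystyle CA=I_n$ is therefore impossible unless $\displaystyle n\le m$ (symmetrically, $\displaystyle AD=I_m$ forces $\displaystyle m\le n$). With NumPy, for m = 2 < n = 3:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 2, 3
A = rng.standard_normal((m, n))  # m x n
C = rng.standard_normal((n, m))  # n x m

CA = C @ A  # n x n, but rank(CA) <= min(m, n) = 2 < 3
assert np.linalg.matrix_rank(CA) <= min(m, n)
assert not np.allclose(CA, np.eye(n))  # so CA = I_3 is impossible
```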

Re: Proof involving matrices

Quote:

Originally Posted by

**Prove It** You should know that you can only multiply matrices when the number of columns in the first is equal to the number of rows in the second, and the resulting product matrix will have the "remaining" dimensions as its order. So that means $\displaystyle \displaystyle \begin{align*} \mathbf{C}_{n \times m} \cdot \mathbf{A}_{m \times n} = \left( \mathbf{CA} \right)_{n \times n} \end{align*}$.

Anyway, the idea will be to show that two matrices are equal to the same thing, and are therefore equal to each other. Please note that where I'm using the inverse matrix notation, it means either a right-inverse or a left-inverse, i.e. some matrix which will multiply to give the identity without the original matrix necessarily being square. Working on the first product we have

$\displaystyle \displaystyle \begin{align*} \mathbf{C}_{n \times m} \cdot \mathbf{A}_{m \times n} &= \mathbf{I}_n \\ \mathbf{C}^{-1}_{m \times n} \cdot \mathbf{C}_{n \times m} \cdot \mathbf{A}_{m \times n} &= \mathbf{C}^{-1}_{m \times n} \cdot \mathbf{I}_n \\ \mathbf{I}_m \cdot \mathbf{A}_{m \times n} &= \mathbf{C}^{-1}_{m \times n} \\ \mathbf{A}_{m \times n} &= \mathbf{C}^{-1}_{m \times n} \end{align*}$

Working on the second product we have

$\displaystyle \displaystyle \begin{align*} \mathbf{A}_{m \times n} \cdot \mathbf{D}_{n \times m} &= \mathbf{I}_m \\ \mathbf{A}_{m \times n} \cdot \mathbf{D}_{n \times m} \cdot \mathbf{D}^{-1}_{m \times n} &= \mathbf{I}_m \cdot \mathbf{D}^{-1}_{m \times n} \\ \mathbf{A}_{m \times n} \cdot \mathbf{I}_n &= \mathbf{D}^{-1}_{m \times n} \\ \mathbf{A}_{m \times n} &= \mathbf{D}^{-1}_{m \times n} \end{align*}$

So it should be clear that $\displaystyle \displaystyle \begin{align*} \mathbf{C}^{-1}_{m \times n} = \mathbf{D}^{-1}_{m \times n} \end{align*}$, and so we get

$\displaystyle \displaystyle \begin{align*} \mathbf{C}_{n \times m} \cdot \mathbf{C}^{-1}_{m \times n} &= \mathbf{C}_{n \times m} \cdot \mathbf{D}^{-1}_{m \times n} \\ \mathbf{I}_n &= \mathbf{C}_{n \times m} \cdot \mathbf{D}^{-1}_{m \times n} \\ \mathbf{I}_n \cdot \mathbf{D}_{ n \times m} &= \mathbf{C}_{n \times m} \cdot \mathbf{D}^{-1}_{m \times n} \cdot \mathbf{D}_{n \times m} \\ \mathbf{D}_{n \times m} &= \mathbf{C}_{n \times m} \cdot \mathbf{I}_m \\ \mathbf{D}_{n \times m} &= \mathbf{C}_{n \times m} \end{align*}$

Wow, thank you so much for the reply :) this seemed like a nice way to solve the problem!

Re: Proof involving matrices

Quote:

Originally Posted by

**HallsofIvy** I would look at this more "abstractly". A is a linear transformation that maps $\displaystyle R^m$ to $\displaystyle R^n$. C is a linear transformation that maps $\displaystyle R^n$ to $\displaystyle R^m$, and D is a linear transformation that maps $\displaystyle R^m$ to $\displaystyle R^n$.

Now, suppose n > m. Then A maps $\displaystyle R^m$ into an (at most) m-dimensional **subspace** of $\displaystyle R^n$. But that means that there exists a vector, v say, in $\displaystyle R^n$ such that $\displaystyle Au\ne v$ for any u in $\displaystyle R^m$. Look at ADv for that v. Do you see that ADv **cannot** be equal to v?

Thank you for the reply!

I think I understand why ADv cannot be equal to v, from what you wrote. Since A maps $\displaystyle R^m$ into a subspace, like you said, it does not cover the whole of $\displaystyle R^n$; it is not "onto". Therefore there will be a vector v that is left "out", and ADv will never be able to equal that vector?

Sorry, I think I am just failing to understand why that proves that C = D?

Re: Proof involving matrices

Quote:

Originally Posted by

**Plato** This is a comment on both replies.

Note that $\displaystyle A$ is an $\displaystyle m\times n$ matrix. **Each** of $\displaystyle C~\&~D$ is an $\displaystyle n\times m$ matrix.

Usually, as linear transformations, $\displaystyle A:R^n\to R^m,~C:R^m\to R^n,~\&~D:R^m\to R^n$.

Unless we know that $\displaystyle n=m$ we don't think about the inverse of a matrix. There is a notion of *right-inverse* and *left-inverse*, but I don't think that is called for here.

You are given that $\displaystyle CA=I_n~\&~AD=I_m$, thus

$\displaystyle C=CI_m=C(AD)=(CA)D=I_nD=D$.

How do we get $\displaystyle m=n~?$

Thank you so so much for the help!

Ok, I understood what you did there :)

Let me try to show m = n. Attempting it in a similar way, I came up with this:

$\displaystyle I_m = CC^{-1} = C(AI_n) = CA(I_n) = I_n(I_n) = I_n$

$\displaystyle I_m = I_n$

Hope that is correct =D I have another question, if you don't mind: you said that A maps $\displaystyle R^n$ into $\displaystyle R^m$, but I thought it mapped $\displaystyle R^m$ to $\displaystyle R^n$, since A has m rows and n columns?

Re: Proof involving matrices

Quote:

Originally Posted by

**Nora314** Let me try to show m = n. Attempting it in a similar way, I came up with this:

$\displaystyle I_m = CC^{-1} = C(AI_n) = CA(I_n) = I_n(I_n) = I_n$

$\displaystyle I_m = I_n$

Hope that is correct =D I have another question, if you don't mind: you said that A maps $\displaystyle R^n$ into $\displaystyle R^m$, but I thought it mapped $\displaystyle R^m$ to $\displaystyle R^n$, since A has m rows and n columns?

How do you know that $\displaystyle C^{-1}$ exists?

It must be a square matrix. That is what you are trying to prove.

Re: Proof involving matrices

Quote:

Originally Posted by

**Nora314** if you don't mind, you said that A maps $\displaystyle R^n$ into $\displaystyle R^m$, I thought that it mapped $\displaystyle R^m$ to $\displaystyle R^n$, since A has m rows and n columns?

I have waited until now to answer your question above.

Look at this webpage. I did use the word *usually*.

Most authors, **but not all**, use the convention of writing the transformation as:

$\displaystyle T\cdot v$ where $\displaystyle T$ is an $\displaystyle m\times n$ matrix and $\displaystyle v$ is an $\displaystyle n \times 1$ column vector, yielding an $\displaystyle m \times 1$ column vector.

Hence, it maps $\displaystyle R^n\to R^m$.

Again, be warned this is not an absolute notation.
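The shape bookkeeping under that convention can be checked in a couple of lines (a minimal NumPy sketch; the numbers are arbitrary):

```python
import numpy as np

m, n = 2, 3
T = np.ones((m, n))  # an m x n matrix
v = np.ones((n, 1))  # an n x 1 column vector

w = T @ v            # an m x 1 column vector
print(w.shape)       # (2, 1)
# So, viewed as a map, T sends R^n (here R^3) to R^m (here R^2).
```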