# Thread: Identity map between bases

1. ## Identity map between bases

My book (Tensor Geometry - Poston & Dodson) says the following:

If $\displaystyle \beta = (b_1,..., b_n)$ is a basis for X, and $\displaystyle A : X \rightarrow Y$ is an isomorphism, then $\displaystyle A\beta = (Ab_1,..., Ab_n)$ is a basis for Y.
If $\displaystyle \beta$ is a basis for X and $\displaystyle A : X \rightarrow X$ is an isomorphism, the change of basis matrix $\displaystyle [I]_\beta^{A\beta}$ is exactly the matrix $\displaystyle ([A]_\beta^\beta)^{\leftarrow}$.
I just can't seem to agree with this result!

After hours of tearing my hair out I have come up with the following argument...Please point out where I've gone wrong...

Take some basis $\displaystyle \beta$ (written as the matrix whose columns are the basis vectors), some vector $\displaystyle \mathbf{x}$, and its representation $\displaystyle x^\beta$ in $\displaystyle \beta$-coordinates:

$\displaystyle \beta x^\beta=\mathbf{x}$
$\displaystyle \Rightarrow x^\beta=\beta^{\leftarrow}\mathbf{x}$
$\displaystyle \Rightarrow [I]_\beta^{\beta'} x^\beta=[I]_\beta^{\beta'} \beta^{\leftarrow}\mathbf{x}$

for some other basis $\displaystyle \beta'$ where $\displaystyle [I]_\beta^{\beta'}$ is the change of basis matrix from $\displaystyle \beta$ to $\displaystyle \beta'$ coordinates.
so
$\displaystyle [I]_\beta^{\beta'} x^\beta=[I]_\beta^{\beta'} \beta^{\leftarrow}\mathbf{x}= x^{\beta'}$ --(*)

We also know the coordinates of $\displaystyle \mathbf{x}$ in the $\displaystyle \beta'$ coords using the $\displaystyle \beta'$ basis:

$\displaystyle \beta' x^{\beta'}=\mathbf{x}$
$\displaystyle \Rightarrow x^{\beta'}=\beta'^{\leftarrow}\mathbf{x}$ --(**)

(*) and (**) combine to give

$\displaystyle [I]_\beta^{\beta'} \beta^{\leftarrow}\mathbf{x}=\beta'^{\leftarrow}\mathbf{x}$
$\displaystyle \Rightarrow [I]_\beta^{\beta'} \beta^{\leftarrow}=\beta'^{\leftarrow}$
$\displaystyle \Rightarrow [I]_\beta^{\beta'}=\beta'^{\leftarrow}\beta$

This seems like a nice neat result to me, but if $\displaystyle \beta'=A\beta$ as it is in the book, we have

$\displaystyle [I]_\beta^{\beta'}=\beta'^{\leftarrow}\beta$
$\displaystyle \Rightarrow [I]_\beta^{A \beta}=(A\beta)^{\leftarrow}\beta$
$\displaystyle \Rightarrow [I]_\beta^{A \beta}=\beta^{\leftarrow}A^{\leftarrow}\beta$
$\displaystyle \not=A^{\leftarrow}$

However, if $\displaystyle \beta'=\beta A$
$\displaystyle [I]_\beta^{\beta A}=\beta'^{\leftarrow}\beta$
$\displaystyle \Rightarrow [I]_\beta^{\beta A}=(\beta A)^{\leftarrow}\beta$
$\displaystyle \Rightarrow [I]_\beta^{\beta A}=A^{\leftarrow}\beta^{\leftarrow}\beta$
$\displaystyle =A^{\leftarrow}$

which is the required result.....

I have tried some basic examples with actual numbers and the results support what I have here... Unless I have some fundamental misunderstanding of all this and what it is supposed to mean, which is quite possible...
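Here is one of those checks transcribed into NumPy (the particular matrices are just example choices, with $\displaystyle \beta$ taken as the matrix whose columns are the basis vectors, so that $\displaystyle \mathbf{x}=\beta x^\beta$):

```python
import numpy as np

# Basis matrix: columns are the basis vectors, so x = beta @ x_beta.
beta = np.array([[1.0, 0.5],
                 [1.0, 1.0]])
A = np.array([[1.0, 2.0],
              [1.0, 1.0]])

def change_of_basis(beta, beta_prime):
    # [I]_beta^{beta'} = beta'^{-1} beta, as derived above.
    return np.linalg.inv(beta_prime) @ beta

# Case beta' = A beta: gives beta^{-1} A^{-1} beta, which is NOT A^{-1}
# (unless A and beta happen to commute).
case1 = change_of_basis(beta, A @ beta)
conjugated = np.linalg.inv(beta) @ np.linalg.inv(A) @ beta
print(np.allclose(case1, conjugated))        # True
print(np.allclose(case1, np.linalg.inv(A)))  # False

# Case beta' = beta A: beta cancels, leaving exactly A^{-1}.
case2 = change_of_basis(beta, beta @ A)
print(np.allclose(case2, np.linalg.inv(A)))  # True
```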

2. Originally Posted by Mmmm
My book (Tensor Geometry - Poston & Dodson) says the following:

I just can't seem to agree with this result!

After hours of tearing my hair out I have come up with the following argument...Please point out where I've gone wrong...

I hate the notation tensor analysts use. What does the arrow in $\displaystyle A^{\leftarrow}$ mean?

3. It is just the inverse of $\displaystyle A$, i.e. $\displaystyle A^{-1}$

4. ## An Example

Just to make things a little simpler I'll give an example and hopefully
somebody will be able to tell me what I'm doing wrong.

Given a basis $\displaystyle \beta$ for X

$\displaystyle \beta=\left(\begin{array}{cc} 1 & \frac{1}{2}\\ 1 & 1\end{array}\right)\text{ so that }\beta^{-1}=\left(\begin{array}{rr} 2 & -1\\ -2 & 2\end{array}\right)$

and a linear isomorphism $\displaystyle A:X\rightarrow X$

$\displaystyle A=\left(\begin{array}{rr} 1 & 2\\ 1 & 1\end{array}\right)\text{ so that }A^{-1}=\left(\begin{array}{rr} -1 & 2\\ 1 & -1\end{array}\right)$

Case (1):

Given a new basis $\displaystyle \beta'=A\beta$

$\displaystyle \beta'=\left(\begin{array}{rr} 1 & 2\\ 1 & 1\end{array}\right)\left(\begin{array}{rr} 1 & \frac{1}{2}\\ 1 & 1\end{array}\right)=\left(\begin{array}{rr} 3 & \frac{5}{2}\\ 2 & \frac{3}{2}\end{array}\right)$

and a vector $\displaystyle \mathbf{x}=\left(\begin{array}{c} 3\\ 2\end{array}\right)$ chosen so that we know that $\displaystyle x^{\beta'}=\left(\begin{array}{c} 1\\ 0\end{array}\right)$

Now, according to the book, $\displaystyle A^{-1}x^{\beta}=x^{\beta'}$, so I'll try it and see if it works. First I need to find $\displaystyle x^{\beta}$:

$\displaystyle x^{\beta}=\beta^{-1}\mathbf{x}=\left(\begin{array}{rr} 2 & -1\\ -2 & 2\end{array}\right)\left(\begin{array}{c} 3\\ 2\end{array}\right)=\left(\begin{array}{r} 4\\ -2\end{array}\right)$

now,

$\displaystyle A^{-1}x^{\beta}=\left(\begin{array}{rr} -1 & 2\\ 1 & -1\end{array}\right)\left(\begin{array}{r} 4\\ -2\end{array}\right)=\left(\begin{array}{r} -8\\ 6\end{array}\right)\neq\left(\begin{array}{r} 1\\ 0\end{array}\right)$

so it doesn't work!

Case (2):

Do the same but with $\displaystyle \beta'=\beta A$

$\displaystyle \beta'=\left(\begin{array}{rr} 1 & \frac{1}{2}\\ 1 & 1\end{array}\right)\left(\begin{array}{rr} 1 & 2\\ 1 & 1\end{array}\right)=\left(\begin{array}{rr} \frac{3}{2} & \frac{5}{2}\\ 2 & 3\end{array}\right)$

and a vector $\displaystyle \mathbf{x}=\left(\begin{array}{c} \frac{3}{2}\\ 2\end{array}\right)$ chosen so that we know that $\displaystyle x^{\beta'}=\left(\begin{array}{c} 1\\ 0\end{array}\right)$

Finding $\displaystyle x^{\beta}$:

$\displaystyle x^{\beta}=\beta^{-1}\mathbf{x}=\left(\begin{array}{rr} 2 & -1\\ -2 & 2\end{array}\right)\left(\begin{array}{c} \frac{3}{2}\\ 2\end{array}\right)=\left(\begin{array}{r} 1\\ 1\end{array}\right)$

now,

$\displaystyle A^{-1}x^{\beta}=\left(\begin{array}{rr} -1 & 2\\ 1 & -1\end{array}\right)\left(\begin{array}{r} 1\\ 1\end{array}\right)=\left(\begin{array}{r} 1\\ 0\end{array}\right)$

It seems to work for $\displaystyle \beta'=\beta A$ but not for $\displaystyle \beta'=A\beta$.
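The whole example can be reproduced mechanically; here's a NumPy transcription of the matrices above (columns of $\displaystyle \beta$ and $\displaystyle \beta'$ are the basis vectors):

```python
import numpy as np

beta = np.array([[1.0, 0.5],
                 [1.0, 1.0]])
A = np.array([[1.0, 2.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

# Case (1): beta' = A beta; x is the first column of beta', i.e. (3, 2),
# so x^{beta'} should be (1, 0).
beta_p1 = A @ beta
x1 = beta_p1[:, 0]
x1_beta = np.linalg.inv(beta) @ x1   # (4, -2)
print(A_inv @ x1_beta)               # (-8, 6): NOT the expected (1, 0)

# Case (2): beta' = beta A; x is the first column of beta', i.e. (3/2, 2).
beta_p2 = beta @ A
x2 = beta_p2[:, 0]
x2_beta = np.linalg.inv(beta) @ x2   # (1, 1)
print(A_inv @ x2_beta)               # (1, 0): the expected x^{beta'}
```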

5. I've figured it out...
So just for the sake of completeness and if anyone is interested I'll post the conclusion to this problem.

My mistake was in thinking that $\displaystyle A=\left[A\right]_{\beta}^{\beta}$.

The map $\displaystyle A:X\rightarrow X$ maps a vector in the vector space X to a new vector in X.

Whereas $\displaystyle \left[A\right]_{\beta}^{\beta}$ maps the components of a vector in the $\displaystyle \beta$ basis to new components in the $\displaystyle \beta$ basis.

The result is the same but the maps are different.

You can do the map $\displaystyle \left[A\right]_{\beta}^{\beta}$ in terms of $\displaystyle A$ by first converting components into a vector ($\displaystyle \beta(x^{\beta})$), then applying A ($\displaystyle A(\beta x^{\beta})$) and then converting back into components ($\displaystyle \beta^{-1}(A\beta x^{\beta})$).

ie

$\displaystyle \left[A\right]_{\beta}^{\beta}=\beta^{-1}A\beta$

My result in my first post was $\displaystyle \left[I\right]_{\beta}^{A\beta}=\beta^{-1}A^{-1}\beta$ which completely confused me (I was expecting $\displaystyle \left[I\right]_{\beta}^{A\beta}=A^{-1}$)

But this was quite correct and can be taken further:

$\displaystyle \begin{aligned} \left[I\right]_{\beta}^{A\beta} & =\beta^{-1}A^{-1}\beta\\ & =(A\beta)^{-1}\beta\\ & =(\beta^{-1}A\beta)^{-1}\\ & =(\left[A\right]_{\beta}^{\beta})^{-1}\end{aligned}$
Which is the required result.
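As a final sanity check, the identity $\displaystyle \left[I\right]_{\beta}^{A\beta}=(\left[A\right]_{\beta}^{\beta})^{-1}$ can be verified numerically with NumPy, reusing the matrices from the example above:

```python
import numpy as np

beta = np.array([[1.0, 0.5],
                 [1.0, 1.0]])
A = np.array([[1.0, 2.0],
              [1.0, 1.0]])

# [A]_beta^beta: convert components to a vector, apply A, convert back.
A_rep = np.linalg.inv(beta) @ A @ beta

# [I]_beta^{A beta} = (A beta)^{-1} beta, per the first post's derivation.
I_rep = np.linalg.inv(A @ beta) @ beta

print(np.allclose(I_rep, np.linalg.inv(A_rep)))  # True
```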

So there you go...
Mystery solved!