Originally Posted by **Showcase_22**

Wow! I may mean that, but I have never seen that notation before!

The book is "Introduction to Vectors and Tensors: Volume 1". The authors are Ray M. Bowen and C.-C. Wang.

I'm trying to show that an isomorphism exists between $\displaystyle \vartheta_1^1(V)$ (the set of tensors of order (1,1)) and $\displaystyle L(V;V)$ (the set of linear maps from V to V).

The first part works quite well. We know that $\displaystyle \dim \vartheta_1^1(V)=N^2$ where $\displaystyle \dim V = N$.

Now we have to show that $\displaystyle \dim L(V;V)=N^2$. To do this, let $\displaystyle e_1, \ldots , e_N$ be a basis for $V$. Define $\displaystyle N^2$ linear transformations $\displaystyle A^k_{\alpha} :V \rightarrow V$ by

$\displaystyle A^k_{\alpha}e_k=e_{\alpha}$ for $\displaystyle k, \alpha=1, \ldots , N$

$\displaystyle A^k_{\alpha} e_p=0$ for $\displaystyle k \neq p$
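In matrix terms (with respect to the basis $\displaystyle e_1, \ldots , e_N$), these $\displaystyle N^2$ transformations are just the matrices with a single 1 entry. A quick NumPy sketch of the defining properties (the function name and the 0-based indexing are mine, not the book's):

```python
import numpy as np

N = 3

def elementary(alpha, k, n=N):
    """Matrix of the transformation A^k_alpha: it sends e_k to e_alpha
    and every other basis vector e_p (p != k) to 0.
    Indices are 0-based here, unlike the book's 1-based convention."""
    E = np.zeros((n, n))
    E[alpha, k] = 1.0
    return E

e = np.eye(N)                      # column e[:, p] is the basis vector e_p
A20 = elementary(2, 0)             # the book's A^k_alpha with k = 0, alpha = 2
print(np.allclose(A20 @ e[:, 0], e[:, 2]))  # A^k_alpha e_k = e_alpha -> True
print(np.allclose(A20 @ e[:, 1], 0))        # A^k_alpha e_p = 0 for p != k -> True
```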

If A is an arbitrary member of $\displaystyle L(V;V)$ then $\displaystyle Ae_k \in V$ so

$\displaystyle Ae_k=\sum_{\alpha=1}^N A^{\alpha}_k e_{\alpha}$ where $\displaystyle k=1, \ldots ,N$.

But we also know that $\displaystyle A^k_{\alpha}e_k=e_{\alpha}$. This gives:

$\displaystyle Ae_k=\sum_{\alpha=1}^N A^{\alpha}_k e_{\alpha}=\sum_{\alpha=1}^N A^{\alpha}_k A^k_{\alpha}e_k=\sum_{\alpha=1}^N \sum_{s=1}^N A^{\alpha}_s A^s_{\alpha}e_k$

(the last equality holds because $\displaystyle A^s_{\alpha}e_k=0$ whenever $s \neq k$, so the extra terms in the double sum all vanish).

We can rearrange this to get:

$\displaystyle \left( A-\sum_{\alpha=1}^N \sum_{s=1}^N A^{\alpha}_s A^s_{\alpha} \right)e_k=0$ valid $\displaystyle \forall e_k \in \{e_1, \ldots ,e_N \}$

But the expression in brackets is a linear map, and a linear map that vanishes on every basis vector vanishes on all of $V$, i.e.

$\displaystyle \left( A-\sum_{\alpha=1}^N \sum_{s=1}^N A^{\alpha}_s A^s_{\alpha} \right)v =0$ valid $\displaystyle \forall v \in V$

This implies that $\displaystyle A= \sum_{\alpha=1}^N \sum_{s=1}^NA^{\alpha}_s A^s_{\alpha}$ meaning that the $\displaystyle N^2$ linear transformations we defined at the start generate $\displaystyle L(V;V)$.
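The spanning claim is easy to check numerically: any matrix is the sum of its entries times the corresponding single-entry matrices. A small sketch (again with 0-based indices, not the book's convention):

```python
import numpy as np

N = 3
rng = np.random.default_rng(0)
A = rng.standard_normal((N, N))    # an arbitrary member of L(V;V), as a matrix

# Rebuild A from the N^2 elementary transformations: the scalar coefficient
# multiplying the transformation that sends e_s to e_alpha is A[alpha, s].
rebuilt = np.zeros((N, N))
for alpha in range(N):
    for s in range(N):
        E = np.zeros((N, N))
        E[alpha, s] = 1.0          # matrix of the transformation A^s_alpha
        rebuilt += A[alpha, s] * E # scalar A^alpha_s times transformation A^s_alpha

print(np.allclose(rebuilt, A))     # True: the N^2 transformations span L(V;V)
```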

Now we have to prove that these transformations are linearly independent. We do this by supposing that, for some scalars $\displaystyle A^{\alpha}_s$,

$\displaystyle \sum_{\alpha=1}^N \sum_{s=1}^NA^{\alpha}_s A^s_{\alpha}=0$.

From here we can use the $\displaystyle N^2$ linear transformations from before to get:

$\displaystyle \sum_{\alpha=1}^N \sum_{s=1}^NA^{\alpha}_s A^s_{\alpha}(e_p)=\sum_{\alpha=1}^N A^{\alpha}_p e_{\alpha}=0$.

But since the $\displaystyle e_{\alpha}$ are basis vectors, they are linearly independent. Therefore $\displaystyle A^{\alpha}_p=0$ for $\displaystyle p, \alpha = 1, \ldots ,N$.

Therefore the set of $\displaystyle A^s_{\alpha}$ is a basis of $\displaystyle L(V;V)$.

This gives that $\displaystyle \dim L(V;V) =\left( \dim V \right)^2=N^2$.

I find it weird that you went through all this trouble to prove a very elementary fact from linear algebra which, I presume, must be assumed when you reach the necessary level to mess with tensors...

There is also a theorem in the book stating that if two vector spaces have the same dimension then an isomorphism exists between the two. In this case $\displaystyle \dim L(V;V)= \dim \vartheta_1^1(V)=N^2$ so an isomorphism exists by the theorem.

"...two vector spaces OVER THE SAME FIELD..."

Finally, I actually want to find an example of an isomorphism between $\displaystyle L(V;V)$ and $\displaystyle \vartheta_1^1(V)$. The book then defines the function $\displaystyle \hat{A}: V^* \times V \rightarrow \mathbb{R}$ where $\displaystyle \hat{A}(v^*,v) \equiv <v^*, Av>$ and $\displaystyle A$ is an endomorphism of $V$.
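Concretely, if $\displaystyle V=\mathbb{R}^N$ and covectors are represented as row vectors, then $\displaystyle \hat{A}(v^*,v)=v^*(Av)$, and evaluating $\displaystyle \hat{A}$ on dual-basis/basis pairs recovers every entry of $A$, so $\displaystyle \hat{A}$ determines $A$ completely. A numerical sketch of this (assuming the standard bases of $\mathbb{R}^N$ and its dual):

```python
import numpy as np

N = 3
rng = np.random.default_rng(1)
A = rng.standard_normal((N, N))    # an endomorphism of V = R^N, as a matrix

def A_hat(v_star, v):
    """The associated (1,1)-tensor: A_hat(v*, v) = <v*, Av>,
    with the covector v* represented as a row vector."""
    return v_star @ (A @ v)

e = np.eye(N)
# Feeding A_hat the dual basis covector e^i and the basis vector e_j
# returns A[i, j], so A_hat = 0 would force every entry of A to be 0.
recovered = np.array([[A_hat(e[i], e[:, j]) for j in range(N)]
                      for i in range(N)])
print(np.allclose(recovered, A))   # True
```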

Since the two vector spaces have the same finite dimension, we only have to show that the map $\displaystyle A \mapsto \hat{A}$ is injective (a pigeonhole-style argument: an injective linear map between spaces of equal finite dimension is automatically a bijection).

The book does this by setting $\displaystyle \hat{A}=0$, which gives $\displaystyle <v^*,Av>=0$ for all $\displaystyle v^* \in V^*$ and $\displaystyle v \in V$ (I thought this was confusing: how does this prove that the map is injective?)

Since the scalar product is definite (?), this gives $\displaystyle Av=0 \ \forall v \in V$ and thus $\displaystyle A=0$.

Consequently the operation "hat" is an isomorphism.

The last part is what I'm particularly confused about. I don't see how setting it to 0 shows that it's injective.

Sorry for the lengthy post; I figured it would be better if I showed exactly what I've done so far!