Math Help - (Trace of A) squared as a linear function of A tensor A?

1. Show that the square of the trace of a matrix, $(tr A)^2$, is a linear function of $A \otimes A$.

I have to show that if $A$ is a $(1,1)$ tensor then $(tr A)^2$ is a linear function of $A \otimes A$.

My approach:

If I define a function $f$ by $f(A \otimes A) = (A \otimes A)^{ij}_{ij}$ (Einstein summation is used, and $(A \otimes A)^{ij}_{ij}$ are the components with respect to a basis), then I have:

$f(A \otimes A) = (A \otimes A)^{ij}_{ij} = A^i_i A^j_j = (tr A)^2$

and

$f(\alpha A \otimes A + B \otimes B) = (\alpha A \otimes A + B \otimes B)^{ij}_{ij} \stackrel{*}{=} \alpha (A \otimes A)^{ij}_{ij} + (B \otimes B)^{ij}_{ij} = \alpha f(A \otimes A) + f(B \otimes B)$
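This can also be sanity-checked numerically. The sketch below is my own numpy illustration, not part of the proof (`tensor_product` and `f` are just names I chose): it builds the components $(A \otimes A)^{ij}_{kl} = A^i_k A^j_l$ and verifies both the trace identity and the linearity step at $^*$ on a random example:

```python
import numpy as np

def tensor_product(A, B):
    # Components (A ⊗ B)^{ij}_{kl} = A^i_k B^j_l, stored as T[i, j, k, l]
    return np.einsum('ik,jl->ijkl', A, B)

def f(T):
    # Contract upper with lower indices pairwise: T^{ij}_{ij} (Einstein summation)
    return np.einsum('ijij->', T)

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
alpha = 2.5

TA, TB = tensor_product(A, A), tensor_product(B, B)

# f(A ⊗ A) = A^i_i A^j_j = (tr A)^2
assert np.isclose(f(TA), np.trace(A) ** 2)

# f is linear on (2,2) tensors: f(α T + S) = α f(T) + f(S)
assert np.isclose(f(alpha * TA + TB), alpha * f(TA) + f(TB))
```

A numerical check on one random example is of course not a proof, but it is a quick way to catch index mistakes.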

I'm not quite sure about the identity I have used at the $^*$, but maybe it follows from:

Let $e_i$ and $\varepsilon^j$ be a basis and its dual basis such that

$A = A^i_j e_i \otimes \varepsilon^j$
and
$B = B^i_j e_i \otimes \varepsilon^j$

then

$\alpha A + B = \alpha A^i_j e_i \otimes \varepsilon^j + B^i_j e_i \otimes \varepsilon^j = (\alpha A^i_j + B^i_j) e_i \otimes \varepsilon^j$

By uniqueness of the coefficients I get:

$(\alpha A + B)^i_j = \alpha A^i_j + B^i_j$

I think I could prove this in the more general case I actually used above, by a theorem stating that for any $(r,s)$ tensor there are unique coefficients $A^{i_1 \dots i_r}_{j_1 \dots j_s}$, but there are so many indices that I don't want to write it all out here.

2. I think $(tr A)^2$ is precisely $A \otimes A$. Just expand $(A_i^j e_i\otimes\epsilon^j)^2$.

3. I don't understand

$A \otimes A$ is a function on $V \times V \times V^* \times V^*$ if $V$ is the vector space.

And $(tr A)^2$ is a scalar, so how do I compare the two things?

And how do you define taking the square of a tensor? By $B^2 = B \circ B$ or $B^2 = B \otimes B$ ?

Because how do you evaluate

$
(A_i^j e_i\otimes\epsilon^j)^2 = (A_i^j e_i\otimes\epsilon^j)(A_{\alpha}^{\beta} e_{\alpha}\otimes\epsilon^{\beta})
$
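For what it's worth, the two candidate meanings of the square can be contrasted concretely. The numpy sketch below is my own illustration (not from the thread), treating $B$ as a matrix so that $B \circ B$ is the matrix product while $B \otimes B$ has four indices; a contraction of the inner index pair relates the two:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3))  # a (1,1) tensor, represented as a matrix

# B ∘ B: composition of B with itself -- again a (1,1) tensor (a matrix)
comp = B @ B
assert comp.shape == (3, 3)

# B ⊗ B: the tensor product, with components (B ⊗ B)^{ij}_{kl} = B^i_k B^j_l
tens = np.einsum('ik,jl->ijkl', B, B)
assert tens.shape == (3, 3, 3, 3)

# Contracting the inner pair of indices of B ⊗ B recovers the composition:
# (B ∘ B)^i_l = B^i_k B^k_l
assert np.allclose(np.einsum('ikkl->il', tens), comp)
```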

4. $A \otimes A$ is a function on $V\times V^*$, since it is a $(1,1)$ tensor.

and you are right, I forgot to write the tr in that expression

But you have already solved the problem, by writing

$
(tr A)^2=A^i_i A^j_j=(A \otimes A)^{ij}_{ij}
$

as this is linear in $A \otimes A$. The same holds for more general tensors.

5. Originally Posted by Rebesques
$V\times V^*$, since it is a $(1,1)$ tensor
But $A$ is a $(1,1)$ tensor, so $A \otimes A$ is a $(2,2)$ tensor.

OK, so the way I solved it is fine? I'm just starting to read about tensors, so I'm easily confused.

But I wouldn't say that

$
(tr A)^2=A^i_i A^j_j=(A \otimes A)^{ij}_{ij}
$

is the same as $A \otimes A$, since I have changed the indices. Even if you just mean the coefficients of $A \otimes A$: the coefficients of $A \otimes A$ are

$
(A \otimes A)^{i \alpha}_{j \beta}
$

and I used the contraction

$
(A \otimes A)^{ij}_{ij}
$

I don't know if it's clear what I meant here?

6. But $A$ is a $(1,1)$ tensor, so $A \otimes A$ is a $(2,2)$
Hm, I don't think my Do Carmo textbook would agree. What's your textbook?

but I wouldn't say that $(tr A)^2=A^i_i A^j_j=(A \otimes A)^{ij}_{ij}$ is the same as $A \otimes A$
Doesn't have to be, as we just need to show it's a linear function.

7. Tensor Analysis on Manifolds by Bishop and Goldberg.

If you take the tensor product of two $(1,1)$ tensors, wouldn't you get a $(2,2)$ tensor?

E.g., if $f$ and $g$ are functions on $V \times V^*$, then

$(f \otimes g)(x, z, y, w) = f(x,y)\,g(z,w)$

which is a function on $V \times V \times V^* \times V^*$, i.e. a $(2,2)$ tensor?
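That index count can be checked concretely. In the numpy sketch below (my own illustration, with random data, not from the thread), $A \otimes A$ indeed takes two vectors and two covectors as arguments and evaluates as the product $A(y,x)\,A(w,z)$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))   # a (1,1) tensor on a 3-dimensional V

# Components (A ⊗ A)^{ij}_{kl} = A^i_k A^j_l: four free indices, so a (2,2) tensor
T = np.einsum('ik,jl->ijkl', A, A)
assert T.ndim == 4

# Evaluate it on two vectors x, z and two covectors y, w:
# (A ⊗ A)(y, w, x, z) = A(y, x) * A(w, z), where A(y, x) = y_i A^i_k x^k
x, z = rng.standard_normal(3), rng.standard_normal(3)
y, w = rng.standard_normal(3), rng.standard_normal(3)

lhs = np.einsum('ijkl,i,j,k,l->', T, y, w, x, z)
rhs = (y @ A @ x) * (w @ A @ z)
assert np.isclose(lhs, rhs)
```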