
Math Help - (tr A)^2 as a linear function of A \otimes A

  1. #1
    Newbie

    Show that the trace of a matrix squared, (tr A)^2, is a linear function of A \otimes A

    I have to show that if A is a (1,1) tensor then (tr A)^2 is a linear function of A \otimes A.

    My approach:

    If I define a function f by f(A \otimes A) = (A \otimes A)^{ij}_{ij} (Einstein summation is used, and the (A \otimes A)^{ij}_{kl} are the coefficients with respect to a basis), then I have:

    f(A \otimes A) = (A \otimes A)^{ij}_{ij} = A^i_i A^j_j = (tr A)^2

    and

     f(\alpha A \otimes A + B \otimes B) = (\alpha A \otimes A + B \otimes B)^{ij}_{ij} =^* \alpha (A \otimes A)^{ij}_{ij} + (B \otimes B)^{ij}_{ij} =
    \alpha f(A \otimes A) + f(B \otimes B)

    I'm not quite sure about the identity I have used at the ^*, but maybe it follows from:

    Let e_i be a basis and \varepsilon^j the dual basis, such that

    A = A^i_j e_i \otimes \varepsilon^j
    and
    B = B^i_j e_i \otimes \varepsilon^j

    then

     \alpha A + B  = \alpha A^i_j e_i \otimes \varepsilon^j + B^i_j e_i \otimes \varepsilon^j = (\alpha A^i_j + B^i_j) e_i \otimes \varepsilon^j

    By uniqueness of the coefficients I get:

     (\alpha A + B)^i_j  = \alpha A^i_j + B^i_j

    I think I could prove this in the more general case which I have used, using a theorem that states that for any (r,s) tensor there are unique coefficients A^{i_1 \ldots i_r}_{j_1 \ldots j_s}, but there are so many indices that I don't want to write it all out here.
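
    (As a quick numerical sanity check, not part of the proof, one can test both identities with NumPy's einsum; the sizes and names below are arbitrary:)

        import numpy as np

        A = np.random.rand(4, 4)      # components A^i_j of a (1,1) tensor
        B = np.random.rand(4, 4)
        alpha = 2.5

        # components (A \otimes A)^{ik}_{jl} = A^i_j A^k_l, stored as a 4-index array
        AA = np.einsum('ij,kl->ikjl', A, A)
        BB = np.einsum('ij,kl->ikjl', B, B)

        # f(T) = T^{ij}_{ij}: contract first upper with first lower index, second with second
        f = lambda T: np.einsum('abab->', T)

        print(np.isclose(f(AA), np.trace(A)**2))                    # f(A \otimes A) = (tr A)^2
        print(np.isclose(f(alpha*AA + BB), alpha*f(AA) + f(BB)))    # the starred linearity step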

  2. #2
    Super Member Rebesques
    I think (tr A)^2 is precisely A \otimes A. Just expand (A_i^j e_i\otimes\epsilon^j)^2.

  3. #3
    Newbie
    I don't understand

    A \otimes A is a function on V \times V \times V^* \times V^* if V is the vector space.

    And (tr A)^2 is a scalar, so how do I compare the two things?

    And how do you define taking the square of a tensor? By B^2 = B \circ B or B^2 = B \otimes B ?

    Because how do you evaluate

    (A_i^j e_i \otimes \epsilon^j)^2 = (A_i^j e_i \otimes \epsilon^j)(A_{\alpha}^{\beta} e_{\alpha} \otimes \epsilon^{\beta}) ?

  4. #4
    Super Member Rebesques
    A \otimes A is a function on V \times V^*, since it is a (1,1) tensor.

    Quote Originally Posted by mrandersdk View Post
    "Because how do you evaluate ..."
    And you are right, I forgot to write the tr in that expression.

    But you have already solved the problem, by writing

    (tr A)^2 = A^i_i A^j_j = (A \otimes A)^{ij}_{ij}

    as this is linear in A \otimes A. The same holds for more general tensors.
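
    Spelled out in components: for any (2,2) tensors T and S and a scalar \alpha,

    f(\alpha T + S) = (\alpha T + S)^{ij}_{ij} = \alpha T^{ij}_{ij} + S^{ij}_{ij} = \alpha f(T) + f(S),

    which is exactly the step you marked with ^*, and it follows from the same uniqueness-of-coefficients argument you gave for (1,1) tensors.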

  5. #5
    Newbie
    Quote Originally Posted by Rebesques View Post
    "V \times V^*, since it is a (1,1) tensor"
    But A is a (1,1) tensor, so A \otimes A is a (2,2) tensor.

    Quote Originally Posted by Rebesques View Post
    "But you have already solved the problem, by writing (tr A)^2 = A^i_i A^j_j = (A \otimes A)^{ij}_{ij}, as this is linear in A \otimes A. Same thing holds for more general tensors."
    OK, so the way I solved it is fine? I'm just starting to read about tensors, so I'm easy to confuse.

    But I wouldn't say that

    (tr A)^2 = A^i_i A^j_j = (A \otimes A)^{ij}_{ij}

    is the same as

    A \otimes A.

    I have changed the indices. Even if you just mean the coefficients of A \otimes A: the coefficients of A \otimes A are (A \otimes A)^{i \alpha}_{j \beta}, while I used the contracted (A \otimes A)^{ij}_{ij}.

    I don't know if it's clear what I meant here?
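
    To make it explicit: the coefficients are (A \otimes A)^{i \alpha}_{j \beta} = A^i_j A^{\alpha}_{\beta}, and what I wrote, (A \otimes A)^{ij}_{ij}, is the double contraction obtained by setting j = i and \beta = \alpha and summing, which gives A^i_i A^{\alpha}_{\alpha} = (tr A)^2.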

  6. #6
    Super Member Rebesques
    Quote Originally Posted by mrandersdk View Post
    "But A is a (1,1) tensor, so A \otimes A is a (2,2)"
    Hm, I don't think my do Carmo textbook would agree. What's your textbook?


    Quote Originally Posted by mrandersdk View Post
    "But I wouldn't say that (tr A)^2 = A^i_i A^j_j = (A \otimes A)^{ij}_{ij} is the same as A \otimes A"
    It doesn't have to be, as we just need to show it's a linear function.

  7. #7
    Newbie
    Tensor Analysis on Manifolds, by Bishop.

    If you take the tensor product of two (1,1) tensors, wouldn't you get a (2,2) tensor?

    For example, if f and g are functions on V \times V^*, then

    (f \otimes g)(x, z, y, w) = f(x, y) g(z, w),

    which is a function on V \times V \times V^* \times V^*, i.e. a (2,2) tensor?
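
    (Numerically, the index count can be checked with a small NumPy sketch; the names and sizes are arbitrary:)

        import numpy as np

        A = np.random.rand(3, 3)              # components A^i_j of a (1,1) tensor
        AA = np.einsum('ij,kl->ikjl', A, A)   # components (A \otimes A)^{ik}_{jl} = A^i_j A^k_l
        print(AA.shape)                       # (3, 3, 3, 3): two upper and two lower slots, i.e. a (2,2) tensor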
