
Jacobi identity

  1. #1
    MHF Contributor NonCommAlg
    Jacobi identity

    Let u,v,w \in \mathbb{R}^3. Prove that u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0, where  \times is the cross product of vectors.

    Note: The proof should be as indirect as possible.

  2. #2
    MHF Contributor alexmahone
    Quote Originally Posted by NonCommAlg View Post
    Let u,v,w \in \mathbb{R}^3. Prove that u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0, where  \times is the cross product of vectors.

    Note: The proof should be as indirect as possible.
    u \times (v \times w)=v(u.w)-w(u.v)

    Similarly,

    v \times (w \times u)=w(v.u)-u(v.w)

    w \times (u \times v)=u(w.v)-v(w.u)

    Adding these 3 equations,

    u \times (v \times w)+v \times (w \times u)+w \times (u \times v)=0

    QED
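
    A quick numerical sanity check of the "BAC-CAB" expansion used above (a minimal sketch in Python with NumPy on arbitrary random vectors; it illustrates the identity rather than proving it):

    import numpy as np

    # check u x (v x w) = v (u.w) - w (u.v) on random vectors in R^3
    rng = np.random.default_rng(0)
    for _ in range(100):
        u, v, w = rng.standard_normal((3, 3))
        lhs = np.cross(u, np.cross(v, w))
        rhs = v * np.dot(u, w) - w * np.dot(u, v)
        assert np.allclose(lhs, rhs)

    # adding the three cyclic expansions then gives the Jacobi identity
    jacobi = (np.cross(u, np.cross(v, w))
              + np.cross(v, np.cross(w, u))
              + np.cross(w, np.cross(u, v)))
    assert np.allclose(jacobi, 0)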

  3. #3
    Super Member simplependulum
    My solution was wrong, sorry!

  4. #4
    Senior Member pankaj
    Quote Originally Posted by NonCommAlg View Post

    Note: The proof should be as indirect as possible.
    Now what does this mean?

  5. #5
    MHF Contributor NonCommAlg
    Quote Originally Posted by pankaj View Post
    Now what does this mean?
    means "not using the definition of cross product and doing all that calculations" because that everybody could do and it's not even nice!

    Quote Originally Posted by alexmahone

    u \times (v \times w)=v(u.w)-w(u.v)
    and how would you prove that?

  6. #6
    Banned
    Quote Originally Posted by NonCommAlg View Post
    means "not using the definition of cross product and doing all that calculations" because that everybody could do and it's not even nice!



    and how would you prove that?
    Are you talking about using this

     u = u_1 e_1 + u_2 e_2 + u_3 e_3
     v = v_1 e_1 + v_2 e_2 + v_3 e_3
     w = w_1 e_1 + w_2 e_2 + w_3 e_3

    and

     e_i \times e_i = 0

     e_1 \times e_2 = e_3, \ \ e_1 \times e_3 = -e_2, \ \ e_2 \times e_1 = -e_3, \ \ e_2 \times e_3 = e_1, \ \ e_3 \times e_1 = e_2, \ \ e_3 \times e_2 = -e_1

    Then

     v \times w = v_1 w_2 \, e_1\times e_2 + v_1 w_3 \, e_1 \times e_3 + v_2 w_1 \, e_2\times e_1 + v_2 w_3 \, e_2 \times e_3 + v_3 w_1 \, e_3\times e_1 + v_3 w_2 \, e_3 \times e_2 = (v_2w_3-v_3w_2) \; e_1+(v_3w_1-v_1w_3) \; e_2 + (v_1w_2-v_2w_1) \; e_3

     u \times (v \times w) = \ \ (u_2v_1w_2-u_2v_2w_1 - u_3v_3w_1+u_3v_1w_3) \; e_1 + (u_3v_2w_3-u_3v_3w_2-u_1v_1w_2+u_1v_2w_1) \; e_2 + (u_1v_3w_1-u_1v_1w_3-u_2v_2w_3+u_2v_3w_2) \; e_3

    Similarly,

     v \times (w \times u) = \ \ (v_2w_1u_2-v_2w_2u_1 - v_3w_3u_1+v_3w_1u_3) \; e_1 + (v_3w_2u_3-v_3w_3u_2-v_1w_1u_2+v_1w_2u_1) \; e_2 + (v_1w_3u_1-v_1w_1u_3-v_2w_2u_3+v_2w_3u_2) \; e_3

     w \times (u \times v) = \ \ (w_2u_1v_2-w_2u_2v_1 - w_3u_3v_1+w_3u_1v_3) \; e_1 + (w_3u_2v_3-w_3u_3v_2-w_1u_1v_2+w_1u_2v_1) \; e_2 + (w_1u_3v_1-w_1u_1v_3-w_2u_2v_3+w_2u_3v_2) \; e_3

    Finally,

    u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0




    Oops, guess this is ugly enough, but I need to take a rest for now.
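
    The cancellation claimed above can also be checked symbolically; a minimal sketch in Python with SymPy (the symbol names mirror the components u_1, u_2, u_3 and so on used above):

    import sympy as sp

    # symbolic components, mirroring u = u_1 e_1 + u_2 e_2 + u_3 e_3 above
    u = sp.Matrix(sp.symbols('u1:4'))
    v = sp.Matrix(sp.symbols('v1:4'))
    w = sp.Matrix(sp.symbols('w1:4'))

    # sum of the three double cross products; every term cancels
    total = u.cross(v.cross(w)) + v.cross(w.cross(u)) + w.cross(u.cross(v))
    print(total.applyfunc(sp.expand))  # Matrix([[0], [0], [0]])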

  7. #7
    Banned
    Quote Originally Posted by NonCommAlg View Post
    means "not using the definition of cross product and doing all that calculations" because that everybody could do and it's not even nice!



    and how would you prove that?
    How about using Lie Algebras?

    http://www.ma.ic.ac.uk/~pavl/Jacobi.pdf

    http://www.springerlink.com/content/j058137n07661876/

    http://www.springerlink.com/content/xk6mp37073689238/

    http://adsabs.harvard.edu/abs/2002physics..10074L

    - http://books.google.com/books?id=irt...entity&f=false

  8. #8
    MHF Contributor NonCommAlg
    it's the other way round: we basically need to prove the Jacobi identity (plus the trivial fact that u \times u = 0, for all u \in \mathbb{R}^3) in order to show that (\mathbb{R}^3, +, \times) is a Lie algebra (over \mathbb{R}).
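
    for reference, here is a sketch of what has to be checked (bilinearity of \times is taken as known): to make (\mathbb{R}^3, +, \times) a Lie algebra over \mathbb{R} we need

    (i) u \times u = 0 for all u \in \mathbb{R}^3, which also gives antisymmetry, since 0 = (u+v) \times (u+v) = u \times v + v \times u;

    (ii) the Jacobi identity u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0.

    the Jacobi identity is exactly axiom (ii), so it cannot simply be quoted from the Lie algebra structure of (\mathbb{R}^3, \times) itself.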

  9. #9
    Member halbard
    How about this.

    Warning: I'm relying on the identity \mathbf a*(\mathbf b\times\mathbf c)=(\mathbf a\times\mathbf b)*\mathbf c. This is proved by considering the identity (\mathbf c+\mathbf a)*[(\mathbf c+\mathbf a)\times \mathbf b]=0.

    First assume that \mathbf u, \mathbf v and \mathbf w are linearly dependent. Then one of them is a linear combination of the other two, and since the expression is unchanged under cyclic permutation of \mathbf u, \mathbf v, \mathbf w, we may assume WLOG that there are scalars a, b with \mathbf w=a\mathbf u+b\mathbf v.

    Then \mathbf u\times(\mathbf v\times\mathbf w)=a\mathbf u\times(\mathbf v\times\mathbf u) and \mathbf v\times(\mathbf w\times\mathbf u)=b\mathbf v\times(\mathbf v\times\mathbf u) and so

    \mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)=(a\mathbf u+b\mathbf v)\times(\mathbf v\times\mathbf u)=\mathbf w\times(\mathbf v\times\mathbf u)=-\mathbf w\times(\mathbf u\times\mathbf v), proving the result in this case.

    Now assume that \mathbf u, \mathbf v and \mathbf w are linearly independent, thus forming a basis for \mathbb R^3.

    We have \mathbf u*[\mathbf u\times(\mathbf v\times\mathbf w)]=0 and \mathbf u*[\mathbf v\times(\mathbf w\times\mathbf u)]=(\mathbf u\times\mathbf v)*(\mathbf w\times\mathbf u)=-\mathbf u*[\mathbf w\times(\mathbf u\times\mathbf v)].

    Hence \mathbf u*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0.

    Similarly \mathbf v*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0 and \mathbf w*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0.

    Since \mathbf u, \mathbf v and \mathbf w form a basis and \mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v) is orthogonal to all of them, the result follows.
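
    The two ingredients used in this argument can be sanity checked numerically as well; a minimal sketch in Python with NumPy on random vectors (an illustration, not part of the proof):

    import numpy as np

    rng = np.random.default_rng(1)
    a, b, c = rng.standard_normal((3, 3))

    # scalar triple product identity a.(b x c) = (a x b).c
    assert np.isclose(np.dot(a, np.cross(b, c)), np.dot(np.cross(a, b), c))

    # the identity it is derived from: (c + a).[(c + a) x b] = 0,
    # since any vector is orthogonal to its own cross product with anything
    s = c + a
    assert np.isclose(np.dot(s, np.cross(s, b)), 0)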

  10. #10
    MHF Contributor NonCommAlg
    nice!

  11. #11
    MHF Contributor Bruno J.
    Here is a proof. Consider the special orthogonal group SO(3, \mathbb{R}) as a Lie group. Its Lie algebra \bold{so}(3, \mathbb{R}) is the set of 3\times 3 skew-symmetric matrices, and the Lie bracket is the commutator [A,B]=AB-BA. A 3\times 3 skew-symmetric matrix A can be associated in a natural way with a vector a in \mathbb{R}^3, and it is easy to check that if A and B are associated with a and b, then the vector associated with the bracket [A,B] is a \times b. By the Jacobi identity for the Lie bracket, the result follows.

  12. #12
    MHF Contributor NonCommAlg
    this is a clever solution with lots of details! in simple words: first we note that if S \subseteq R, where R is an associative ring, and if we define [s,t]=st - ts, for all s,t \in S, then the Jacobi identity holds, i.e. [r,[s,t]] + [s,[t,r]] + [t,[r,s]]=0, \ \ \forall r,s,t \in S.

    now let S be the set of skew-symmetric matrices with real entries, i.e. S \subset M_3(\mathbb{R}) consists of all matrices of the form A_{x,y,z}=\begin{pmatrix}0 & -x & -y \\ x & 0 & -z \\ y & z & 0 \end{pmatrix}, \ \ \ x,y,z \in \mathbb{R}. define the map f: S \longrightarrow \mathbb{R}^3 by f(A_{x,y,z})=x \bold{i} + y \bold{j} + z \bold{k}. clearly f is an isomorphism of vector spaces and it's easy to see that f([A_{x,y,z}, A_{x',y',z'}])=(x \bold{i} + y \bold{j} + z \bold{k}) \times (x' \bold{i} + y' \bold{j} + z' \bold{k}). call this result (1).

    since M_3(\mathbb{R}) is an associative ring, we have [A,[B,C]]+[B,[C,A]]+[C,[A,B]]=0, \ \forall A,B,C \in S. taking f of both sides and using the linearity of f and (1) gives us: f(A) \times (f(B) \times f(C)) + f(B) \times (f(C) \times f(A)) + f(C) \times (f(A) \times f(B))=0, which completes the proof.
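
    result (1) is easy to confirm numerically as well; a minimal sketch in Python with NumPy (the helper names skew and f below are just for illustration):

    import numpy as np

    def skew(x, y, z):
        # the matrix A_{x,y,z} from the post above
        return np.array([[0.0,  -x,  -y],
                         [  x, 0.0,  -z],
                         [  y,   z, 0.0]])

    def f(A):
        # the map f: A_{x,y,z} -> (x, y, z), read off the lower triangle
        return np.array([A[1, 0], A[2, 0], A[2, 1]])

    rng = np.random.default_rng(2)
    A = skew(*rng.standard_normal(3))
    B = skew(*rng.standard_normal(3))

    # result (1): f([A, B]) = f(A) x f(B)
    assert np.allclose(f(A @ B - B @ A), np.cross(f(A), f(B)))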
