# Jacobi identity

• Sep 7th 2009, 05:56 PM
NonCommAlg
Jacobi identity
Let $\displaystyle u,v,w \in \mathbb{R}^3.$ Prove that $\displaystyle u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0,$ where $\displaystyle \times$ is the cross product of vectors.

Note: The proof should be as indirect as possible.
• Sep 7th 2009, 06:25 PM
alexmahone
Quote:

Originally Posted by NonCommAlg
Let $\displaystyle u,v,w \in \mathbb{R}^3.$ Prove that $\displaystyle u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0,$ where $\displaystyle \times$ is the cross product of vectors.

Note: The proof should be as indirect as possible.

$\displaystyle u \times (v \times w)=v(u.w)-w(u.v)$

Similarly,

$\displaystyle v \times (w \times u)=w(v.u)-u(v.w)$

$\displaystyle w \times (u \times v)=u(w.v)-v(w.u)$

Adding these three expansions, the terms on the right cancel in pairs, so $\displaystyle u \times (v \times w)+v \times (w \times u)+w \times (u \times v)=0$

QED
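As a quick sanity check (not a proof, and assuming nothing beyond the component formula for the cross product), the BAC-CAB expansion and the resulting cancellation can be verified numerically with a few random vectors:

```python
# Numerical spot-check of the BAC-CAB rule and the Jacobi identity,
# using plain Python tuples (no external libraries).
import random

def cross(a, b):
    # Component formula for the cross product in R^3.
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def scale(c, a):
    return tuple(c*x for x in a)

random.seed(0)
u, v, w = (tuple(random.uniform(-1, 1) for _ in range(3)) for _ in range(3))

# BAC-CAB: u x (v x w) = v(u.w) - w(u.v)
lhs = cross(u, cross(v, w))
rhs = add(scale(dot(u, w), v), scale(-dot(u, v), w))
assert all(abs(x - y) < 1e-12 for x, y in zip(lhs, rhs))

# Jacobi: the three cyclic terms sum to zero.
total = add(add(cross(u, cross(v, w)), cross(v, cross(w, u))),
            cross(w, cross(u, v)))
assert all(abs(x) < 1e-12 for x in total)
```

Of course this only checks one random triple, but it catches sign errors in the expansions instantly.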
• Sep 7th 2009, 09:36 PM
simplependulum
My solution was wrong, sorry!
• Sep 8th 2009, 08:41 AM
pankaj
Quote:

Originally Posted by NonCommAlg

Note: The proof should be as indirect as possible.

Now what does this mean?
• Sep 8th 2009, 03:49 PM
NonCommAlg
Quote:

Originally Posted by pankaj
Now what does this mean?

it means "not using the definition of the cross product and doing all those calculations", because everybody could do that and it's not even nice!

Quote:

Originally Posted by alexmahone

$\displaystyle u \times (v \times w)=v(u.w)-w(u.v)$

and how would you prove that?
• Sep 8th 2009, 06:30 PM
luobo
Quote:

Originally Posted by NonCommAlg
it means "not using the definition of the cross product and doing all those calculations", because everybody could do that and it's not even nice!

and how would you prove that?

Are you talking about using this:

$\displaystyle u = u_1 e_1 + u_2 e_2 + u_3 e_3$
$\displaystyle v = v_1 e_1 + v_2 e_2 + v_3 e_3$
$\displaystyle w = w_1 e_1 + w_2 e_2 + w_3 e_3$

and

$\displaystyle e_i \times e_i = 0$

$\displaystyle e_1 \times e_2 = e_3, \ \ e_1 \times e_3 = -e_2, \ \ e_2 \times e_1 = -e_3, \ \ e_2 \times e_3 = e_1, \ \ e_3 \times e_1 = e_2, \ \ e_3 \times e_2 = -e_1$

Then

$\displaystyle v \times w = v_1 w_2 \, e_1\times e_2 + v_1 w_3 \, e_1 \times e_3 + v_2 w_1 \, e_2\times e_1 + v_2 w_3 \, e_2 \times e_3 + v_3 w_1 \, e_3\times e_1 + v_3 w_2 \, e_3 \times e_2$

$\displaystyle = (v_2w_3-v_3w_2) \; e_1+(v_3w_1-v_1w_3) \; e_2 + (v_1w_2-v_2w_1) \; e_3$

$\displaystyle u \times (v \times w) = (u_2v_1w_2-u_2v_2w_1 - u_3v_3w_1+u_3v_1w_3) \; e_1 + (u_3v_2w_3-u_3v_3w_2-u_1v_1w_2+u_1v_2w_1) \; e_2 + (u_1v_3w_1-u_1v_1w_3-u_2v_2w_3+u_2v_3w_2) \; e_3$

Similarly,

$\displaystyle v \times (w \times u) = (v_2w_1u_2-v_2w_2u_1 - v_3w_3u_1+v_3w_1u_3) \; e_1 + (v_3w_2u_3-v_3w_3u_2-v_1w_1u_2+v_1w_2u_1) \; e_2 + (v_1w_3u_1-v_1w_1u_3-v_2w_2u_3+v_2w_3u_2) \; e_3$

$\displaystyle w \times (u \times v) = (w_2u_1v_2-w_2u_2v_1 - w_3u_3v_1+w_3u_1v_3) \; e_1 + (w_3u_2v_3-w_3u_3v_2-w_1u_1v_2+w_1u_2v_1) \; e_2 + (w_1u_3v_1-w_1u_1v_3-w_2u_2v_3+w_2u_3v_2) \; e_3$

Finally,

$\displaystyle u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0$

Oops, guess this is ugly enough, but I need to take a rest for now.
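The basis table above can also be checked mechanically: a short sketch (the function name `cross_from_table` is mine, for illustration) that stores the products $\displaystyle e_i \times e_j$ as structure constants and expands bilinearly, then confirms the three cyclic terms sum to zero:

```python
# table[(i, j)] holds the coefficients of e_i x e_j in the basis (e1, e2, e3),
# transcribed from the multiplication table above (indices 0-based).
table = {
    (0, 0): (0, 0, 0),  (0, 1): (0, 0, 1),  (0, 2): (0, -1, 0),
    (1, 0): (0, 0, -1), (1, 1): (0, 0, 0),  (1, 2): (1, 0, 0),
    (2, 0): (0, 1, 0),  (2, 1): (-1, 0, 0), (2, 2): (0, 0, 0),
}

def cross_from_table(a, b):
    # Expand (sum a_i e_i) x (sum b_j e_j) by bilinearity.
    out = [0.0, 0.0, 0.0]
    for i in range(3):
        for j in range(3):
            coeff = a[i] * b[j]
            for k in range(3):
                out[k] += coeff * table[(i, j)][k]
    return tuple(out)

u, v, w = (2.0, -1.0, 3.0), (1.0, 4.0, 0.0), (-2.0, 5.0, 1.0)
total = tuple(
    x + y + z for x, y, z in zip(
        cross_from_table(u, cross_from_table(v, w)),
        cross_from_table(v, cross_from_table(w, u)),
        cross_from_table(w, cross_from_table(u, v)),
    )
)
assert all(abs(t) < 1e-9 for t in total)
```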
• Sep 8th 2009, 07:19 PM
luobo
Quote:

Originally Posted by NonCommAlg
means "not using the definition of cross product and doing all that calculations" because that everybody could do and it's not even nice!

and how would you prove that?

http://www.ma.ic.ac.uk/~pavl/Jacobi.pdf

• Sep 9th 2009, 05:17 AM
NonCommAlg
Quote:

Originally Posted by luobo

it's the other way round: we basically need to prove the Jacobi identity (plus the trivial fact that $\displaystyle u \times u = 0,$ for all $\displaystyle u \in \mathbb{R}^3$) in order to show that $\displaystyle (\mathbb{R}^3, +, \times)$ is a Lie algebra (over $\displaystyle \mathbb{R}$).
• Sep 12th 2009, 08:49 AM
halbard

Warning: I'm relying on the identity $\displaystyle \mathbf a*(\mathbf b\times\mathbf c)=(\mathbf a\times\mathbf b)*\mathbf c$. This is proved by considering the identity $\displaystyle (\mathbf c+\mathbf a)*[(\mathbf c+\mathbf a)\times \mathbf b]=0$.

First assume that $\displaystyle \mathbf u$, $\displaystyle \mathbf v$ and $\displaystyle \mathbf w$ are linearly dependent. WLOG assume there are scalars $\displaystyle a$, $\displaystyle b$ with $\displaystyle \mathbf w=a\mathbf u+b\mathbf v$.

Then $\displaystyle \mathbf u\times(\mathbf v\times\mathbf w)=a\mathbf u\times(\mathbf v\times\mathbf u)$ and $\displaystyle \mathbf v\times(\mathbf w\times\mathbf u)=b\mathbf v\times(\mathbf v\times\mathbf u)$ and so

$\displaystyle \mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)=(a\mathbf u+b\mathbf v)\times(\mathbf v\times\mathbf u)=\mathbf w\times(\mathbf v\times\mathbf u)=-\mathbf w\times(\mathbf u\times\mathbf v)$, proving the result in this case.

Now assume that $\displaystyle \mathbf u$, $\displaystyle \mathbf v$ and $\displaystyle \mathbf w$ are linearly independent, thus forming a basis for $\displaystyle \mathbb R^3$.

We have $\displaystyle \mathbf u*[\mathbf u\times(\mathbf v\times\mathbf w)]=0$ and $\displaystyle \mathbf u*[\mathbf v\times(\mathbf w\times\mathbf u)]=(\mathbf u\times\mathbf v)*(\mathbf w\times\mathbf u)=-\mathbf u*[\mathbf w\times(\mathbf u\times\mathbf v)]$.

Hence $\displaystyle \mathbf u*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0$.

Similarly $\displaystyle \mathbf v*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0$ and $\displaystyle \mathbf w*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0$.

Since $\displaystyle \mathbf u$, $\displaystyle \mathbf v$ and $\displaystyle \mathbf w$ form a basis and $\displaystyle \mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)$ is orthogonal to all of them, the result follows.
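Both ingredients of this argument, the scalar triple product identity and the orthogonality of the Jacobi sum to $\displaystyle \mathbf u$, $\displaystyle \mathbf v$, $\displaystyle \mathbf w$, can be spot-checked numerically (a sketch with random vectors, not a substitute for the proof):

```python
# Check a.(b x c) = (a x b).c and the orthogonality of the Jacobi sum.
import random

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

random.seed(1)
u, v, w = (tuple(random.uniform(-1, 1) for _ in range(3)) for _ in range(3))

# Scalar triple product identity used in the proof above.
assert abs(dot(u, cross(v, w)) - dot(cross(u, v), w)) < 1e-12

# The Jacobi sum J is orthogonal to each of u, v, w.
J = tuple(sum(t) for t in zip(cross(u, cross(v, w)),
                              cross(v, cross(w, u)),
                              cross(w, cross(u, v))))
for x in (u, v, w):
    assert abs(dot(x, J)) < 1e-12
```

Since three generic random vectors are linearly independent, this exercises exactly the second case of the proof.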
• Sep 13th 2009, 02:16 AM
NonCommAlg
Quote:

Originally Posted by halbard


nice! (Clapping)
• Oct 7th 2009, 10:38 PM
Bruno J.
Here is a proof. Consider the special orthogonal group $\displaystyle SO(3, \mathbb{R})$ as a Lie group. Its Lie algebra $\displaystyle \bold{so}(3, \mathbb{R})$ is the set of $\displaystyle 3\times 3$ skew-symmetric matrices, and the Lie bracket is the commutator $\displaystyle [A,B]=AB-BA$. A $\displaystyle 3\times 3$ skew-symmetric matrix $\displaystyle A$ can be associated in a natural way with a vector $\displaystyle a$ in $\displaystyle \mathbb{R}^3$, and it is easy to check that the vector associated with the bracket $\displaystyle [A,B]$ is $\displaystyle a \times b$, where $\displaystyle a$ and $\displaystyle b$ are the vectors associated with $\displaystyle A$ and $\displaystyle B$. By the Jacobi identity for the Lie bracket, the result follows.
• Oct 8th 2009, 12:20 AM
NonCommAlg
Quote:

Originally Posted by Bruno J.
Here is a proof. Consider the special orthogonal group $\displaystyle SO(3, \mathbb{R})$ as a Lie group. Its Lie algebra $\displaystyle \bold{so}(3, \mathbb{R})$ is the set of $\displaystyle 3\times 3$ skew-symmetric matrices, and the Lie bracket is the commutator $\displaystyle [A,B]=AB-BA$. A $\displaystyle 3\times 3$ skew-symmetric matrix $\displaystyle A$ can be associated in a natural way with a vector $\displaystyle a$ in $\displaystyle \mathbb{R}^3$, and it is easy to check that the vector associated to the bracket $\displaystyle [A,B]$ is $\displaystyle a \times b$. By the Jacobi identity for the Lie bracket, the result follows.

this is a clever solution with lots of details! (Nod) in simple words: first we note that if $\displaystyle S \subseteq R,$ where $\displaystyle R$ is an associative ring, and if we define $\displaystyle [s,t]=st - ts,$ for all $\displaystyle s,t \in S,$ then the

Jacobi identity holds, i.e. $\displaystyle [r,[s,t]] + [s,[t,r]] + [t,[r,s]]=0, \ \ \forall r,s,t \in S.$ now let $\displaystyle S$ be the set of skew-symmetric matrices with real entries, i.e. $\displaystyle S \subset M_3(\mathbb{R})$ consists of all matrices

of the form $\displaystyle A_{x,y,z}=\begin{pmatrix}0 & -x & -y \\ x & 0 & -z \\ y & z & 0 \end{pmatrix}, \ \ \ x,y,z \in \mathbb{R}.$ define the map $\displaystyle f: S \longrightarrow \mathbb{R}^3$ by $\displaystyle f(A_{x,y,z})=x \bold{i} + y \bold{j} + z \bold{k}.$ clearly $\displaystyle f$ is an isomorphism of vector spaces and it's easy to see that $\displaystyle f([A_{x,y,z}, A_{x',y',z'}])=(x \bold{i} + y \bold{j} + z \bold{k}) \times (x' \bold{i} + y' \bold{j} + z' \bold{k}).$ call this result $\displaystyle (1).$ since $\displaystyle M_3(\mathbb{R})$ is an associative ring, we have $\displaystyle [A,[B,C]]+[B,[C,A]]+[C,[A,B]]=0, \ \forall A,B,C \in S.$

taking $\displaystyle f$ of both sides and using the linearity of $\displaystyle f$ and $\displaystyle (1)$ gives us: $\displaystyle f(A) \times (f(B) \times f(C)) + f(B) \times (f(C) \times f(A)) + f(C) \times (f(A) \times f(B))=0,$ which completes the proof.
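The isomorphism $\displaystyle f$ and result $\displaystyle (1)$ can be spot-checked with a short sketch (the helper names `skew`, `unskew`, `bracket` are mine, chosen for illustration): map $\displaystyle (x,y,z)$ to the skew-symmetric matrix $\displaystyle A_{x,y,z}$ above, take the commutator, and read the cross product back off.

```python
# Verify numerically that the commutator of skew-symmetric matrices
# corresponds to the cross product under the map f above.
def skew(v):
    # Build A_{x,y,z} as in the post.
    x, y, z = v
    return [[0, -x, -y],
            [x,  0, -z],
            [y,  z,  0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def bracket(A, B):
    AB, BA = matmul(A, B), matmul(B, A)
    return [[AB[i][j] - BA[i][j] for j in range(3)] for i in range(3)]

def unskew(A):
    # Inverse of skew: read (x, y, z) back off the matrix entries.
    return (A[1][0], A[2][0], A[2][1])

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

a, b = (1.0, 2.0, 3.0), (-2.0, 0.5, 4.0)
lhs = unskew(bracket(skew(a), skew(b)))   # f([A, B])
rhs = cross(a, b)                         # f(A) x f(B)
assert all(abs(p - q) < 1e-12 for p, q in zip(lhs, rhs))
```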