Jacobi identity

• September 7th 2009, 05:56 PM
NonCommAlg
Jacobi identity
Let $u,v,w \in \mathbb{R}^3.$ Prove that $u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0,$ where $\times$ is the cross product of vectors.

Note: The proof should be as indirect as possible.
• September 7th 2009, 06:25 PM
alexmahone
Quote:

Originally Posted by NonCommAlg
Let $u,v,w \in \mathbb{R}^3.$ Prove that $u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0,$ where $\times$ is the cross product of vectors.

Note: The proof should be as indirect as possible.

$u \times (v \times w)=v(u \cdot w)-w(u \cdot v)$

Similarly,

$v \times (w \times u)=w(v \cdot u)-u(v \cdot w)$

$w \times (u \times v)=u(w \cdot v)-v(w \cdot u)$

Adding these,

$u \times (v \times w)+v \times (w \times u)+w \times (u \times v)=0$

QED
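The BAC-CAB expansion used here can be sanity-checked numerically. A minimal Python sketch (illustrative only, not a proof; the helper names are my own):

```python
# Check u x (v x w) = v(u.w) - w(u.v) on sample integer vectors
# (illustrative only, not a proof).
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

def scale(s, a):
    return tuple(s*x for x in a)

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

u, v, w = (1, 2, 3), (-4, 0, 5), (2, -1, 7)
lhs = cross(u, cross(v, w))
rhs = sub(scale(dot(u, w), v), scale(dot(u, v), w))
assert lhs == rhs
```

With exact integer arithmetic both sides come out identical, as the identity predicts.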
• September 7th 2009, 09:36 PM
simplependulum
My solution was wrong, sorry!
• September 8th 2009, 08:41 AM
pankaj
Quote:

Originally Posted by NonCommAlg

Note: The proof should be as indirect as possible.

Now what does this mean?
• September 8th 2009, 03:49 PM
NonCommAlg
Quote:

Originally Posted by pankaj
Now what does this mean?

It means "not using the definition of the cross product and doing all those calculations", because anyone could do that and it's not even nice!

Quote:

Originally Posted by alexmahone

$u \times (v \times w)=v(u \cdot w)-w(u \cdot v)$

and how would you prove that?
• September 8th 2009, 06:30 PM
luobo
Quote:

Originally Posted by NonCommAlg
It means "not using the definition of the cross product and doing all those calculations", because anyone could do that and it's not even nice!

and how would you prove that?

Are you talking about using this

$u = u_1 e_1 + u_2 e_2 + u_3 e_3$
$v = v_1 e_1 + v_2 e_2 + v_3 e_3$
$w = w_1 e_1 + w_2 e_2 + w_3 e_3$

and

$e_i \times e_i = 0$

$e_1 \times e_2 = e_3, \ \ e_1 \times e_3 = -e_2, \ \ e_2 \times e_1 = -e_3, \ \ e_2 \times e_3 = e_1, \ \ e_3 \times e_1 = e_2, \ \ e_3 \times e_2 = -e_1$

Then

$v \times w = v_1 w_2 \, e_1\times e_2 + v_1 w_3 \, e_1 \times e_3 + v_2 w_1 \, e_2\times e_1 + v_2 w_3 \, e_2 \times e_3 + v_3 w_1 \, e_3\times e_1 + v_3 w_2 \, e_3 \times e_2$

$= (v_2w_3-v_3w_2) \; e_1+(v_3w_1-v_1w_3) \; e_2 + (v_1w_2-v_2w_1) \; e_3$

$u \times (v \times w) = (u_2v_1w_2-u_2v_2w_1 - u_3v_3w_1+u_3v_1w_3) \; e_1 + (u_3v_2w_3-u_3v_3w_2-u_1v_1w_2+u_1v_2w_1) \; e_2 + (u_1v_3w_1-u_1v_1w_3-u_2v_2w_3+u_2v_3w_2) \; e_3$

Similarly,

$v \times (w \times u) = (v_2w_1u_2-v_2w_2u_1 - v_3w_3u_1+v_3w_1u_3) \; e_1 + (v_3w_2u_3-v_3w_3u_2-v_1w_1u_2+v_1w_2u_1) \; e_2 + (v_1w_3u_1-v_1w_1u_3-v_2w_2u_3+v_2w_3u_2) \; e_3$

$w \times (u \times v) = (w_2u_1v_2-w_2u_2v_1 - w_3u_3v_1+w_3u_1v_3) \; e_1 + (w_3u_2v_3-w_3u_3v_2-w_1u_1v_2+w_1u_2v_1) \; e_2 + (w_1u_3v_1-w_1u_1v_3-w_2u_2v_3+w_2u_3v_2) \; e_3$

Finally,

$u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0$

Oops, guess this is ugly enough, but I need to take a rest for now.
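The componentwise cancellation can also be confirmed by brute force over many random vectors. A small Python sketch (illustrative only, not a proof; all names are my own):

```python
# Check that the three expanded triple products sum to the zero vector
# for many random integer vectors (illustrative, not a proof).
import random

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

random.seed(0)
for _ in range(1000):
    u, v, w = (tuple(random.randint(-9, 9) for _ in range(3))
               for _ in range(3))
    total = add(add(cross(u, cross(v, w)),
                    cross(v, cross(w, u))),
                cross(w, cross(u, v)))
    assert total == (0, 0, 0)
```

Integer arithmetic is exact here, so every trial cancels to exactly (0, 0, 0).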
• September 8th 2009, 07:19 PM
luobo
Quote:

Originally Posted by NonCommAlg
It means "not using the definition of the cross product and doing all those calculations", because anyone could do that and it's not even nice!

and how would you prove that?

http://www.ma.ic.ac.uk/~pavl/Jacobi.pdf

• September 9th 2009, 05:17 AM
NonCommAlg
Quote:

Originally Posted by luobo
http://www.ma.ic.ac.uk/~pavl/Jacobi.pdf

it's the other way round: we basically need to prove the Jacobi identity (plus the trivial fact that $u \times u = 0,$ for all $u \in \mathbb{R}^3$) in order to show that $(\mathbb{R}^3, +, \times)$ is a Lie algebra (over $\mathbb{R}$).
• September 12th 2009, 08:49 AM
halbard

Warning: I'm relying on the identity $\mathbf a*(\mathbf b\times\mathbf c)=(\mathbf a\times\mathbf b)*\mathbf c$. This is proved by considering the identity $(\mathbf c+\mathbf a)*[(\mathbf c+\mathbf a)\times \mathbf b]=0$.
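This cyclic triple-product identity is easy to sanity-check numerically. A minimal Python sketch (illustrative only; the actual proof is the polarization argument above, and the helper names are my own):

```python
# Check the scalar triple product identity a.(b x c) = (a x b).c
# on sample vectors (illustrative only, not a proof).
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

a, b, c = (1, -2, 4), (3, 0, -1), (-2, 5, 2)
assert dot(a, cross(b, c)) == dot(cross(a, b), c)
```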

First assume that $\mathbf u$, $\mathbf v$ and $\mathbf w$ are linearly dependent. WLOG assume there are scalars $a$, $b$ with $\mathbf w=a\mathbf u+b\mathbf v$.

Then $\mathbf u\times(\mathbf v\times\mathbf w)=a\mathbf u\times(\mathbf v\times\mathbf u)$ and $\mathbf v\times(\mathbf w\times\mathbf u)=b\mathbf v\times(\mathbf v\times\mathbf u)$ and so

$\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)=(a\mathbf u+b\mathbf v)\times(\mathbf v\times\mathbf u)=\mathbf w\times(\mathbf v\times\mathbf u)=-\mathbf w\times(\mathbf u\times\mathbf v)$, proving the result in this case.

Now assume that $\mathbf u$, $\mathbf v$ and $\mathbf w$ are linearly independent, thus forming a basis for $\mathbb R^3$.

We have $\mathbf u*[\mathbf u\times(\mathbf v\times\mathbf w)]=0$ and $\mathbf u*[\mathbf v\times(\mathbf w\times\mathbf u)]=(\mathbf u\times\mathbf v)*(\mathbf w\times\mathbf u)=-\mathbf u*[\mathbf w\times(\mathbf u\times\mathbf v)]$.

Hence $\mathbf u*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0$.

Similarly $\mathbf v*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0$ and $\mathbf w*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0$.

Since $\mathbf u$, $\mathbf v$ and $\mathbf w$ form a basis and $\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)$ is orthogonal to all of them, the result follows.
• September 13th 2009, 02:16 AM
NonCommAlg
Quote:

Originally Posted by halbard

Warning: I'm relying on the identity $\mathbf a*(\mathbf b\times\mathbf c)=(\mathbf a\times\mathbf b)*\mathbf c$. This is proved by considering the identity $(\mathbf c+\mathbf a)*[(\mathbf c+\mathbf a)\times \mathbf b]=0$.

First assume that $\mathbf u$, $\mathbf v$ and $\mathbf w$ are linearly dependent. WLOG assume there are scalars $a$, $b$ with $\mathbf w=a\mathbf u+b\mathbf v$.

Then $\mathbf u\times(\mathbf v\times\mathbf w)=a\mathbf u\times(\mathbf v\times\mathbf u)$ and $\mathbf v\times(\mathbf w\times\mathbf u)=b\mathbf v\times(\mathbf v\times\mathbf u)$ and so

$\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)=(a\mathbf u+b\mathbf v)\times(\mathbf v\times\mathbf u)=\mathbf w\times(\mathbf v\times\mathbf u)=-\mathbf w\times(\mathbf u\times\mathbf v)$, proving the result in this case.

Now assume that $\mathbf u$, $\mathbf v$ and $\mathbf w$ are linearly independent, thus forming a basis for $\mathbb R^3$.

We have $\mathbf u*[\mathbf u\times(\mathbf v\times\mathbf w)]=0$ and $\mathbf u*[\mathbf v\times(\mathbf w\times\mathbf u)]=(\mathbf u\times\mathbf v)*(\mathbf w\times\mathbf u)=-\mathbf u*[\mathbf w\times(\mathbf u\times\mathbf v)]$.

Hence $\mathbf u*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0$.

Similarly $\mathbf v*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0$ and $\mathbf w*[\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)]=0$.

Since $\mathbf u$, $\mathbf v$ and $\mathbf w$ form a basis and $\mathbf u\times(\mathbf v\times\mathbf w)+\mathbf v\times(\mathbf w\times\mathbf u)+\mathbf w\times(\mathbf u\times\mathbf v)$ is orthogonal to all of them, the result follows.

nice! (Clapping)
• October 7th 2009, 10:38 PM
Bruno J.
Here is a proof. Consider the special orthogonal group $SO(3, \mathbb{R})$ as a Lie group. Its Lie algebra $\bold{so}(3, \mathbb{R})$ is the set of $3\times 3$ skew-symmetric matrices, and the Lie bracket is the commutator $[A,B]=AB-BA$. A $3\times 3$ skew-symmetric matrix $A$ can be associated in a natural way with a vector $a$ in $\mathbb{R}^3$, and it is easy to check that the vector associated to the bracket $[A,B]$ is $a \times b$. By the Jacobi identity for the Lie bracket, the result follows.
• October 8th 2009, 12:20 AM
NonCommAlg
Quote:

Originally Posted by Bruno J.
Here is a proof. Consider the special orthogonal group $SO(3, \mathbb{R})$ as a Lie group. Its Lie algebra $\bold{so}(3, \mathbb{R})$ is the set of $3\times 3$ skew-symmetric matrices, and the Lie bracket is the commutator $[A,B]=AB-BA$. A $3\times 3$ skew-symmetric matrix $A$ can be associated in a natural way with a vector $a$ in $\mathbb{R}^3$, and it is easy to check that the vector associated to the bracket $[A,B]$ is $a \times b$. By the Jacobi identity for the Lie bracket, the result follows.

this is a clever solution! (Nod) here it is in simple words, with the details filled in: first we note that if $S \subseteq R,$ where $R$ is an associative ring, and if we define $[s,t]=st - ts,$ for all $s,t \in S,$ then the

Jacobi identity holds, i.e. $[r,[s,t]] + [s,[t,r]] + [t,[r,s]]=0, \ \ \forall r,s,t \in S.$ now let $S$ be the set of skew-symmetric matrices with real entries, i.e. $S \subset M_3(\mathbb{R})$ consists of all matrices

of the form $A_{x,y,z}=\begin{pmatrix}0 & -x & -y \\ x & 0 & -z \\ y & z & 0 \end{pmatrix}, \ \ \ x,y,z \in \mathbb{R}.$ define the map $f: S \longrightarrow \mathbb{R}^3$ by $f(A_{x,y,z})=x \bold{i} + y \bold{j} + z \bold{k}.$ clearly $f$ is an isomorphism of vector spaces and it's easy to see that $f([A_{x,y,z}, A_{x',y',z'}])=(x \bold{i} + y \bold{j} + z \bold{k}) \times (x' \bold{i} + y' \bold{j} + z' \bold{k}).$ call this result $(1).$ since $M_3(\mathbb{R})$ is an associative ring, we have $[A,[B,C]]+[B,[C,A]]+[C,[A,B]]=0, \ \forall A,B,C \in S.$

taking $f$ of both sides and using the linearity of $f$ and $(1)$ gives us: $f(A) \times (f(B) \times f(C)) + f(B) \times (f(C) \times f(A)) + f(C) \times (f(A) \times f(B))=0,$ which completes the proof.
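The correspondence $(1)$ between the commutator of these skew-symmetric matrices and the cross product can be verified numerically. A Python sketch (illustrative only, not a proof; `skew`, `f`, and the other names are my own, with the matrix layout exactly as written above):

```python
# Check f([A,B]) = f(A) x f(B) for the skew-symmetric matrices A_{x,y,z}
# defined above (illustrative only, not a proof).
def skew(x, y, z):
    # the matrix A_{x,y,z} as given in the post
    return [[0, -x, -y],
            [x,  0, -z],
            [y,  z,  0]]

def matmul(A, B):
    return [[sum(A[i][k]*B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def commutator(A, B):
    AB, BA = matmul(A, B), matmul(B, A)
    return [[AB[i][j] - BA[i][j] for j in range(3)] for i in range(3)]

def f(A):
    # read off (x, y, z) from A_{x,y,z}
    return (A[1][0], A[2][0], A[2][1])

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

a, b = (1, 2, 3), (-4, 0, 5)
assert f(commutator(skew(*a), skew(*b))) == cross(a, b)
```

Since the commutator of matrices satisfies the Jacobi identity for free, this correspondence transports it to the cross product, which is exactly the argument above.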