# lie algebra

• December 16th 2011, 01:46 AM
vernal
lie algebra
Who can solve this question?

1. Let L be a complex Lie algebra. Show that L is nilpotent if and only if every 2-dimensional subalgebra of L is abelian. (Use the second version of Engel's theorem.)

I cannot solve it. :(
• December 17th 2011, 03:13 PM
NonCommAlg
Re: lie algebra
Quote:

Originally Posted by vernal

you should post questions like this in the algebra section if you want to get a faster response.

anyway, suppose first that every $2$-dimensional subalgebra of $L$ is abelian and let $a \in L.$ we only need to show that $\text{ad}(a)$ is nilpotent because then we will be done by (the second version of) Engel's theorem. so suppose that $\lambda \in \mathbb{C}$ is an eigenvalue of $\text{ad}(a)$ and $x \in L$ is an eigenvector corresponding to $\lambda.$ then $[a,x] = \lambda x.$ since $[a,x] \in \text{span}\{a,x\},$ the subalgebra generated by $\{a,x\}$ is just $\text{span}\{a,x\},$ hence at most $2$-dimensional and thus, by our hypothesis, abelian (note that 1-dimensional algebras are always abelian). hence $\lambda x = [a,x] = 0,$ which implies $\lambda = 0$ because $x \neq 0.$ so every eigenvalue of $\text{ad}(a)$ is zero and therefore $\text{ad}(a)$ is nilpotent.
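
to spell out the very last step of this direction (it is used but not stated above): over $\mathbb{C},$ a linear operator all of whose eigenvalues are $0$ has characteristic polynomial a pure power of $t,$ so Cayley–Hamilton gives nilpotency directly:

```latex
% every eigenvalue of ad(a) is 0, so over \mathbb{C} its characteristic
% polynomial splits as
\chi_{\operatorname{ad}(a)}(t) = t^{\,d}, \qquad d = \dim_{\mathbb{C}} L,
% and the Cayley--Hamilton theorem then yields
\operatorname{ad}(a)^{\,d} = 0,
% i.e. ad(a) is nilpotent, as required by Engel's theorem.
```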
conversely, suppose that $L$ is nilpotent, say $L^{n+1} = 0$ for some integer $n \geq 1$ (here $L^1 = L$ and $L^{k+1} = [L^k, L]$). now, let $A$ be a subalgebra of $L$ with $\dim_{\mathbb{C}} A = 2$ and suppose that $A$ is not abelian. choose a basis $\{a,b\}$ of $A.$ then $[a,b] = \lambda a + \mu b \neq 0$ for some $\lambda, \mu \in \mathbb{C},$ and, after swapping $a$ and $b$ if necessary, we may assume that $\lambda \neq 0.$ so $[[a,b],b] = \lambda [a,b]$ (because $[b,b] = 0$) and $[[[a,b],b],b] = \lambda^2[a,b]$ and, continuing this process, bracketing with $b$ a total of $n-1$ times gives $\lambda^{n-1}[a,b] \in L^{n+1} = 0.$ but $\lambda \neq 0$ and $[a,b] \neq 0,$ so $\lambda^{n-1}[a,b] \neq 0.$ this contradiction proves that $A$ is abelian.
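
as a quick numerical sanity check (not part of the proof, and the basis choices below are my own), one can compare the $2$-dimensional non-abelian algebra with $[a,b] = b$, where $\text{ad}(a)$ has eigenvalue $1$ and the algebra is therefore not nilpotent, against the $3$-dimensional Heisenberg algebra, which is nilpotent and where every $\text{ad}$ is nilpotent:

```python
import numpy as np

def is_nilpotent(M, tol=1e-9):
    """A d x d matrix M is nilpotent iff M^d = 0."""
    d = M.shape[0]
    return np.allclose(np.linalg.matrix_power(M, d), 0, atol=tol)

# Two-dimensional non-abelian algebra: basis (a, b) with [a, b] = b.
# ad(a): a -> 0, b -> b;   ad(b): a -> -b, b -> 0.
ad_a = np.array([[0., 0.],
                 [0., 1.]])
print(is_nilpotent(ad_a))  # False: ad(a) has eigenvalue 1, so L is not nilpotent

# Heisenberg algebra: basis (x, y, z) with [x, y] = z and z central.
ad_x = np.zeros((3, 3)); ad_x[2, 1] = 1.0   # ad(x): y -> z
ad_y = np.zeros((3, 3)); ad_y[2, 0] = -1.0  # ad(y): x -> -z
ad_z = np.zeros((3, 3))                     # z is central
print(all(is_nilpotent(M) for M in (ad_x, ad_y, ad_z)))  # True
```

this matches the theorem: the first algebra is itself a non-abelian $2$-dimensional subalgebra, while in the Heisenberg algebra every $2$-dimensional subalgebra is abelian.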
• December 17th 2011, 09:48 PM
vernal
Re: lie algebra
Quote:

Originally Posted by NonCommAlg

Thank you very much..... tanks tanks tanks!