
- Dec 16th 2011, 01:46 AM, vernal: lie algebra
Who can solve this question?

1. Let L be a complex Lie algebra. Show that L is nilpotent if and only if every 2-dimensional subalgebra of L is abelian. (Use the second version of Engel's theorem.)

I cannot solve it. :( - Dec 17th 2011, 03:13 PM, NonCommAlg: Re: lie algebra
you should post questions like this in the algebra section if you want to get a faster response.

anyway, suppose first that every $\displaystyle 2$-dimensional subalgebra of $\displaystyle L$ is abelian and let $\displaystyle a \in L.$ we only need to show that $\displaystyle \text{ad}(a)$ is nilpotent, because then we are done by Engel's theorem. so suppose that $\displaystyle \lambda \in \mathbb{C}$ is an eigenvalue of $\displaystyle \text{ad}(a)$ and $\displaystyle x \in L$ is an eigenvector corresponding to $\displaystyle \lambda.$ then $\displaystyle [a,x] = \lambda x,$ so the span of $\displaystyle \{a,x\}$ is closed under the bracket and hence is a subalgebra of dimension at most $\displaystyle 2.$ by our hypothesis it is abelian (note that $\displaystyle 1$-dimensional algebras are always abelian). hence $\displaystyle \lambda x = [a,x] = 0,$ which implies $\displaystyle \lambda = 0.$ so every eigenvalue of $\displaystyle \text{ad}(a)$ is zero and therefore $\displaystyle \text{ad}(a)$ is nilpotent.
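the eigenvalue argument can be sanity-checked on the standard $\displaystyle 2$-dimensional non-abelian Lie algebra (a concrete example of my own choosing, not from the problem): take a basis $\displaystyle \{x, y\}$ with $\displaystyle [x,y] = x.$ a minimal numpy sketch:

```python
import numpy as np

# 2-dimensional non-abelian complex Lie algebra: basis {x, y} with
# [x, y] = x, so ad(y)x = [y, x] = -x and ad(y)y = [y, y] = 0.
# matrix of ad(y) in the ordered basis (x, y):
ad_y = np.array([[-1.0, 0.0],
                 [ 0.0, 0.0]])

eigenvalues = np.linalg.eigvals(ad_y)

# ad(y) has the nonzero eigenvalue -1, so ad(y) is not nilpotent and the
# algebra is not nilpotent -- consistent with the criterion, since this
# algebra is itself a 2-dimensional non-abelian subalgebra.
```

so a nonzero eigenvalue of some $\displaystyle \text{ad}(a)$ is exactly what a non-abelian $\displaystyle 2$-dimensional subalgebra produces, as in the proof above.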

conversely, suppose that $\displaystyle L$ is nilpotent, so $\displaystyle L^{n+1} = 0$ for some integer $\displaystyle n \geq 1.$ now let $\displaystyle A$ be a subalgebra of $\displaystyle L$ with $\displaystyle \dim_{\mathbb{C}} A = 2,$ and suppose that $\displaystyle A$ is not abelian. choose a basis $\displaystyle \{a,b\}$ of $\displaystyle A$ with $\displaystyle [a,b] \neq 0.$ since $\displaystyle [a,b] \in A,$ we have $\displaystyle [a,b] = \lambda a + \mu b$ for some $\displaystyle \lambda, \mu \in \mathbb{C},$ not both zero, and, swapping $\displaystyle a$ and $\displaystyle b$ if necessary, we may assume $\displaystyle \lambda \neq 0.$ so $\displaystyle [[a,b],b] = \lambda [a,b]$ and $\displaystyle [[[a,b],b],b] = \lambda^2[a,b]$ and, if we continue this process, we will eventually get $\displaystyle \lambda^n[a,b] = 0$ because each bracket with $\displaystyle b$ lands one step deeper in the lower central series and $\displaystyle L^{n+1}=0.$ but $\displaystyle \lambda \neq 0$ and $\displaystyle [a,b] \neq 0,$ so $\displaystyle \lambda^n [a,b] \neq 0.$ this contradiction proves that $\displaystyle A$ is abelian. - Dec 17th 2011, 09:48 PM, vernal: Re: lie algebra
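the iterated-bracket identity $\displaystyle [[a,b],b] = \lambda[a,b]$ in the converse direction can also be checked concretely in the same non-abelian $\displaystyle 2$-dimensional algebra (again my own illustrative choice, with $\displaystyle [a,b]=a,$ i.e. $\displaystyle \lambda = 1,$ $\displaystyle \mu = 0$):

```python
import numpy as np

# 2-dimensional Lie algebra with basis (a, b) and [a, b] = a.
def bracket(u, v):
    # for u = u0*a + u1*b and v = v0*a + v1*b, bilinearity and [a,a]=[b,b]=0
    # give [u, v] = (u0*v1 - u1*v0) * [a, b] = (u0*v1 - u1*v0) * a.
    c = u[0] * v[1] - u[1] * v[0]
    return np.array([c, 0.0])

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

ab = bracket(a, b)    # [a, b] = a, so here lambda = 1 and mu = 0
abb = bracket(ab, b)  # [[a, b], b] = lambda * [a, b]

# iterating the bracket with b just multiplies [a, b] by lambda each time,
# so it never vanishes; hence this algebra has no n with L^{n+1} = 0,
# matching the contradiction derived in the converse direction.
w = ab
for _ in range(10):
    w = bracket(w, b)
```

here $\displaystyle w = \lambda^{10}[a,b] = [a,b] \neq 0$ after ten iterations, which is exactly why a nilpotent $\displaystyle L$ cannot contain such a subalgebra.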