
Thread: Lie algebra

  1. #1
Member vernal
    Joined
    Dec 2011
    Posts
    75

Lie algebra

Can anyone solve this question?

1. Let L be a complex Lie algebra. Show that L is nilpotent if and only if every 2-dimensional subalgebra of L is abelian. (Use the second version of Engel's theorem.)

I have not been able to solve it.

  2. #2
    MHF Contributor

    Joined
    May 2008
    Posts
    2,295
    Thanks
    7

Re: Lie algebra

Quote Originally Posted by vernal
You should post questions like this in the algebra section if you want to get a faster response.

Anyway, suppose first that every $\displaystyle 2$-dimensional subalgebra of $\displaystyle L$ is abelian, and let $\displaystyle a \in L.$ We only need to show that $\displaystyle \text{ad}(a)$ is nilpotent, because then we are done by the second version of Engel's theorem: a finite-dimensional Lie algebra is nilpotent if and only if $\displaystyle \text{ad}(x)$ is nilpotent for every $\displaystyle x$ in it. So suppose that $\displaystyle \lambda \in \mathbb{C}$ is an eigenvalue of $\displaystyle \text{ad}(a)$ and $\displaystyle x \in L$ is an eigenvector corresponding to $\displaystyle \lambda.$ Then $\displaystyle [a,x] = \lambda x,$ so the span of $\displaystyle \{a,x\}$ is closed under the bracket and is therefore a subalgebra of dimension at most $\displaystyle 2;$ by our hypothesis it is abelian (note that $\displaystyle 1$-dimensional algebras are always abelian). Hence $\displaystyle \lambda x = [a,x] = 0,$ which implies $\displaystyle \lambda = 0$ because $\displaystyle x \neq 0.$ So every eigenvalue of $\displaystyle \text{ad}(a)$ is zero, and since we are working over $\displaystyle \mathbb{C},$ the characteristic polynomial of $\displaystyle \text{ad}(a)$ is a power of $\displaystyle t;$ by Cayley-Hamilton, $\displaystyle \text{ad}(a)$ is nilpotent.
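To make the eigenvector step concrete, here is a small example of my own (not part of the problem): take the $\displaystyle 2$-dimensional algebra with basis $\displaystyle \{a,x\}$ and bracket $\displaystyle [a,x] = x.$ Here $\displaystyle \text{ad}(a)$ has eigenvalue $\displaystyle 1$ with eigenvector $\displaystyle x,$ and the span of $\displaystyle \{a,x\}$ is exactly the kind of $\displaystyle 2$-dimensional subalgebra the argument produces; it is not abelian, so the hypothesis fails, and indeed this algebra is not nilpotent (its lower central series stabilizes at $\displaystyle \text{span}\{x\} \neq 0$). So a nonzero eigenvalue of some $\displaystyle \text{ad}(a)$ always forces a non-abelian $\displaystyle 2$-dimensional subalgebra, which is precisely what the proof rules out.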
Conversely, suppose that $\displaystyle L$ is nilpotent, say $\displaystyle L^{n+1} = 0$ for some integer $\displaystyle n \geq 1.$ Now let $\displaystyle A$ be a subalgebra of $\displaystyle L$ with $\displaystyle \dim_{\mathbb{C}} A = 2,$ and suppose that $\displaystyle A$ is not abelian. Pick a basis $\displaystyle \{a,b\}$ of $\displaystyle A.$ Then $\displaystyle [a,b] = \lambda a + \mu b \neq 0$ for some $\displaystyle \lambda, \mu \in \mathbb{C},$ and, after swapping $\displaystyle a$ and $\displaystyle b$ if necessary (which only changes the sign of $\displaystyle [a,b]$), we may assume that $\displaystyle \lambda \neq 0.$ So $\displaystyle [[a,b],b] = \lambda [a,b]$ and $\displaystyle [[[a,b],b],b] = \lambda^2 [a,b],$ and if we continue this process, bracketing with $\displaystyle b$ a total of $\displaystyle n-1$ times lands us in $\displaystyle L^{n+1} = 0,$ giving $\displaystyle \lambda^{n-1}[a,b] = 0.$ But $\displaystyle \lambda \neq 0$ and $\displaystyle [a,b] \neq 0,$ so $\displaystyle \lambda^{n-1}[a,b] \neq 0.$ This contradiction proves that $\displaystyle A$ is abelian.
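As a sanity check on the converse (again an illustration of my own, not from the problem): the $\displaystyle 3$-dimensional Heisenberg algebra $\displaystyle H$ with basis $\displaystyle \{x,y,z\}$ and brackets $\displaystyle [x,y] = z,$ $\displaystyle [x,z] = [y,z] = 0$ satisfies $\displaystyle H^3 = [H, \text{span}\{z\}] = 0,$ so it is nilpotent. And every $\displaystyle 2$-dimensional subalgebra $\displaystyle A$ of $\displaystyle H$ is indeed abelian: $\displaystyle [A,A] \subseteq [H,H] = \text{span}\{z\},$ so if $\displaystyle [A,A] \neq 0$ then $\displaystyle z \in A$ and $\displaystyle A = \text{span}\{w,z\}$ for some $\displaystyle w;$ but $\displaystyle z$ is central, so $\displaystyle [A,A] = \text{span}\{[w,z]\} = 0,$ a contradiction.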
    Last edited by NonCommAlg; Dec 18th 2011 at 01:37 AM.

  3. #3
Member vernal
    Joined
    Dec 2011
    Posts
    75

Re: Lie algebra

Quote Originally Posted by NonCommAlg
Thank you very much! Thanks, thanks, thanks!

